Thursday, June 27, 2013

Bad news for the Meaningful Use initiative


Well, let's see how Farzad et al spin this.
Here’s an alarming fact: the meaningful use dropout rate is already 17%.

A recently published assessment of the government’s April EHR attestation data revealed that 17% of the providers who earned an $18,000 EHR incentive in 2011 did not earn the $12,000 second incentive in 2012. Although the analysis was performed by the venerable Wells Fargo, my immediate response was, “That’s impossible! They must have miscalculated the data.”


So I crunched the numbers for myself, and to my astonishment, the conclusion is absolutely correct. A staggering 17% of the providers who succeeded at demonstrating meaningful use for 90 days were unable to sustain that performance for a full year—the second required reporting period—despite the fact that the program’s requirements remained exactly the same and the providers already had the necessary workflows in place to support those requirements. What makes this fact even more troubling is that the 2011 attesters were typically the early EHR adopters and therefore most experienced in the use of the technology.

A 17% loss rate in any business is wholly unacceptable, and this failure does not portend well for the future of the EHR Incentive Program. If $12,000 proved to be insufficient motivation for physicians with meaningful use experience to meet the relatively low requirements of Stage 1 on an ongoing basis, it would be foolish to expect physicians to muster the wherewithal to meet the increasingly demanding requirements of Stage 2. The incentive for a year’s performance at that point will be a mere $4,000.
Compounding this finding is the fact that 14% of physicians who attested to Stage 1 have already stated that they have no intention of attesting to Stage 2...
- Evan Steele
RECs, recall, were commissioned for "One and Done" -- our HITECH mandate was to help get providers through Year One, Stage One (and Adopt/Upgrade/Implement on the Medicaid side).

The upshot of this news should be interesting.

But wait! There's more!

At the end of December 2012, federal records showed that 96,000 eligible physicians had received an incentive payment for the early implementation and attestation of an EHR system, out of a total 505,000 eligible doctors. Mytych (pictured) said those systems must be in place and the providers must meet meaningful use rules by 2015, or face potential federal Medicare penalties. (That deadline, he added, is subject to change.)

The current deadline for implementation of Stage 2, requiring practices to have implemented 90 days of consecutive meaningful use, is now set for next year. Mytych, however, predicts that the deadlines and penalties may be pushed back as well after the Department of Health & Human Services reviews the comments on proposed rules. “It would be good to see them push the deadline back but keep in mind that the federal government will make money off this, since they’re making more in penalties than paying in incentives,” he said...

AND THE HITS JUST KEEP ON COMING

Commentary: Concerns about quality improvement organizations' actions around meaningful use
June 25, 2013 | James M. Hofert, Partner; Roy M. Bossen, Partner; Linnea L. Schramm, Associate; and Michael A. Dowell, Partner, Hinshaw & Culbertson LLP


The federal government is pressuring the medical community to reduce patient care costs while improving the quality of patient care to all patients, including Medicare beneficiaries. Congress, recognizing that hospital readmissions are too common and are costly and often avoidable, passed the Hospital Readmission Reduction Program (HRRP), which ties-in readmission metrics to monetary penalties to encourage hospitals to reduce readmission rates. Federal lawmakers also passed the Health Information Technology for Economic and Clinical Health Act (HITECH), which is intended to stimulate the rapid evolution and adoption of information technology in the healthcare industry, promote the development and use of clinical decision support (CDS) treatment algorithms, encourage active provider participation in discharge planning and care to decrease recidivism, and enhance care coordination through provider-patient communication.

Consistent with these legislative strategies, the Center for Medicare and Medicaid Services (CMS) appears to be encouraging contracted quality improvement organizations (QIOs) to adopt quality care principles (meaningful use criteria), created pursuant to HITECH, as additional criteria to be applied in the evaluation of the adequacy of care provided to Medicare beneficiaries under their jurisdiction...


QIOs such as Telligen have essentially associated the principle of “professionally recognized standards of care” under Section 1156 with “meaningful use criteria” enunciated under HITECH along with other historically applied principles of care. The term “professionally recognized standard of care” is not specifically defined by regulators (the Quality Improvement Organization Manual suggests that the term may be equated to evidence-based practices and/or documented consensus statements, best practices and/or identified norms).
The failure of a physician, hospital or other covered institution to implement acceptable corrective action plans can lead to financial penalties or exclusion from reimbursement for services rendered to Medicare patients. This remedy goes beyond HITECH provisions, which simply provide that institutions not presently in compliance are not entitled to incentive payments...


QIOs should be circumspect in sanctioning physicians and covered institutions for violation of “meaningful use” criteria that has yet to be implemented or fully evaluated by CMS. QIOs may need to consider application of a “sliding scale” assessment, at least initially, as it relates to application of “meaningful use” criteria during care reviews, to allow nonuniversity based hospitals as well as safety-net institutions, the necessary time to bring themselves into compliance with evolving EHR, CDS and discharge planning and care requirements incorporated into recently passed healthcare legislation.
I have worked for HealthInsight -- a "QIO" -- three times spanning a 20-year period, and am quite familiar with the ongoing "conflict of interest" beefs that have dogged them. There have long been concerns about "voluntary" provider participation in non-judgmental, collaborative "quality improvement" initiatives run under the recurring three-year CMS QIO "Scope of Work" contracts, given that the same entities also wield the statutory sanctions hammer (case reviews and bene complaints). A large number of RECs are also QIOs, and this concern has extended to the Meaningful Use program. This (above) takes that issue to an even higher level.

HealthInsight is (a) a three-state QIO (UT, NV, and NM), (b) a bi-state REC (UT and NV), and now (c) the first Nevada HIE (Health Information Exchange). Doesn't take much imagination to see the myriad questions of conflict that arise in such a circumstance.

Tangentially related to this QIO/Meaningful Use dustup is the recent flap over the new Massachusetts law requiring Meaningful Use competency as a condition of physician licensure.

But, in the wake of all this Luddite and otherwise partisan carping cometh ONC...

June 2013
UPDATE ON THE ADOPTION OF HEALTH INFORMATION TECHNOLOGY AND RELATED EFFORTS TO FACILITATE THE ELECTRONIC USE AND EXCHANGE OF HEALTH INFORMATION
 

A REPORT TO CONGRESS

Prepared by:
The Office of the National Coordinator for Health Information Technology (ONC)
Department of Health and Human Services
200 Independence Avenue SW, Washington, DC 20201
EXECUTIVE SUMMARY

OVERVIEW
Information is widely recognized as “the lifeblood of modern medicine.” Health information technology (health IT) has the potential to improve the flow of information across the health care system and serve as the infrastructure to enable care transformation. Health IT comprises technologies — from electronic health records (EHRs) and personal health records (PHRs) to remote monitoring devices and mobile health applications — that can collect, store, and transmit health information. By enabling health information to be used more effectively and efficiently throughout our health system, health IT has the potential to empower providers and patients; make health care and the health system more transparent; enhance the study of care delivery and payment systems; and drive substantial improvements in care, efficiency, and population health.


ONC collaborates with policymakers and stakeholders to address critical issues related to health IT. Working directly with the health IT community, ONC develops consensus-based standards and technologies that facilitate interoperability and health information exchange (HIE). ONC aims to protect the privacy and security of health information and ensure the safe use of health IT in every phase of its development and implementation. The ultimate goal of these efforts is to inspire confidence and trust in health IT. ONC provides expertise, guidance, and resources to ensure that health IT is widely and effectively implemented. ONC also administers a reliable Health IT Certification Program and works closely with CMS to establish the certification criteria for certified EHR technology (CEHRT) that eligible providers must adopt and meaningfully use in order to qualify for incentive payments under the Medicare and Medicaid EHR Incentive Programs.


PROGRESS ON ADOPTION OF EHR TECHNOLOGY & E-PRESCRIBING

Data show steady increases in the adoption of EHRs and key computerized functionalities related to EHR Incentive Programs’ Meaningful Use criteria among office-based physicians and non-federal acute care hospitals.

  • In 2012, nearly three-quarters of office-based physicians (72 percent) had adopted any EHR system. Forty percent of physicians had adopted a “basic” EHR with certain advanced capabilities, more than double the adoption rate in 2009. Physicians achieved at least fifty percent adoption rates for 12 of the 15 EHR Incentive Programs’ Stage 1 Meaningful Use core objectives.
  • As of 2012, 44 percent of non-federal acute care hospitals had adopted a “basic” EHR, more than triple the adoption rate of 2009. The percent of hospitals with certified EHR technology increased by 18 percent between 2011 and 2012, rising from 72 percent to 85 percent. Hospital adoption rates for the EHR Incentive Programs’ Meaningful Use Stage 1 requirements ranged from 72 percent to 94 percent.
  • The percent of physicians e-prescribing using an EHR on one of the nation’s largest e-prescribing networks (Surescripts) increased almost eight-fold, from 7 percent in December 2008 to over half of physicians (54 percent) in December 2012. In the same period, the percent of community pharmacies active on the Surescripts network grew from 69 percent to 95 percent. The percent of new and renewal prescriptions sent electronically increased ten-fold between 2008 and 2012, to approximately 47 percent.
PROGRESS ON MEANINGFUL USE ATTAINMENT

The CMS Medicare and Medicaid EHR Incentive Programs provide financial incentives for the adoption and Meaningful Use of certified EHR technology to improve patient care. CMS established the EHR Incentive Programs through notice and comment rulemaking and created the necessary infrastructure to implement the program in accordance with existing payment policies and program eligibility criteria. CMS regulations spell out the objectives for the Meaningful Use requirements that eligible professionals, eligible hospitals, and CAHs must meet in order to receive an incentive payment. In addition to the incentives, eligible professionals, eligible hospitals, and CAHs that fail to demonstrate Meaningful Use of certified EHR technology will be subject to payment adjustments under Medicare beginning in 2015.


As of April 2013, more than 291,000 professionals, representing more than half of the nation’s eligible professionals, have received incentive payments through the EHR Incentive Programs. Over 3,800 hospitals, representing about 80 percent of eligible hospitals, and including Critical Access Hospitals, have received incentive payments through this program as well...


The Office of the National Coordinator for Health IT:

Health IT Regional Extension Centers Program (REC): RECs have played a pivotal role in providing technical assistance to providers. The 62 RECs are actively working with over 133,000 primary care providers, surpassing the 2012 HHS High Priority Goal of providing assistance to 100,000 primary care providers...
 "Pivotal." Yeah. And "dispensable."

Really not much new. Full PDF report here.


Data show steady increases in the adoption of EHRs and key computerized functionalities related to EHR Incentive Programs’ Meaningful Use criteria among office-based physicians and non-federal acute care hospitals.

One of the ONC report graphs:

This has become a staple visual representation of Health IT and MU "progress." (At least they wrote "data show" rather than the gauche "data shows".) A "doubling" of MU capacity.

Any problems here?

"...with computerized capabilities to meet Meaningful Use core objectives"?

Not that they met the criteria, just that they had EHRs "capable" of doing so. Well, for one thing, since the deployment of ARRA/HITECH, "Certified" EHRs have really become the Only Game In Town, no? Are any mainstream vendors going to write products that don't meet ONC CHPL criteria?

Moreover, let me call your attention to the ever-astute Margalit Gur-Arie:
Spinning EHR Adoption Numbers

On May 22nd, the Secretary of Health and Human Services (HHS) published a momentous press release announcing that “Doctors and hospitals’ use of health IT more than doubles since 2012”. The release was accompanied by two beautiful graphs, one for physicians and one for hospitals, titled “Adoption of Electronic Health Records by Physicians and Other Providers” and “Adoption of Electronic Health Records by Eligible Hospitals”, respectively. Both graphs, shown below, start at zero (0) adoption in January 2011 and climb rapidly to “[m]ore than 291,000 eligible professionals and over 3,800 eligible hospitals” by April 2013.

Of course, the graph titles are incorrect, since there were plenty of electronic medical records in use well before 2011, and the actual text of the press release does make some references to the world prior to 2011...

[T]he compulsive need to spin everything prompted HHS to declare that “use of health IT more than doubles since 2012”, which is ridiculous, and to put forward questionable historical numbers. A more cautious White House, while sticking with HHS provided numbers and crediting the President with this miracle, declares for no apparent mathematical reason that “adoption of electronic health records doubled among office based physicians from 2008 to 2012 and quadrupled in hospitals”. Of course every industry publication and health policy pundit (not to mention Twitter) is repeating these things, including the New York Times, where Mr. Thomas Friedman in a customary fact-free infomercial for his investor buddies is stating: “According to the Obama administration, thanks to incentives in the recovery act there has been nearly a tripling since 2008 of electronic records installed by office-based physicians, and a quadrupling by hospitals”. So which one is it folks? Doubled? Tripled? Quadrupled? Something bigger? Does it matter?
Read the whole post. Kudos to Ms. Gur-Arie.

ONE MORE THING...

The 82-page ONC report is amply festooned with the obligatory words "interoperable" and "interoperability" -- 59 hits in all (an average of ~0.72 per page).


I have commented at some length regarding my dubiety over this totemic incantation before, e.g., from my April 25th, 2013 post:
One.Single.Core.Comprehensive.
Data.Dictionary.Standard


One. Then stand back and watch the Market Work Its Magic in terms of features, functionality, and usability. Let a Thousand RDBMS Schema and Workflow Logic Paths Bloom. Let a Thousand Certified Health IT Systems compete to survive. You need not specify by federal regulation any additional substantive "regulation" of the "means" for achieving the ends that we all agree are desirable and necessary. There are, after all, only three fundamental data types at issue: text (structured, e.g., ICD-9, and unstructured, e.g., open-ended SOAP note narrative), numbers (integer and floating-point decimal), and images. All things above that are mere "representations" of the basic data (e.g., text lengths, date/time formats, logicals, .tiffs, .jpegs, etc.). You can't tell me that a world that can live with, e.g., 10,000 ICD-9 codes (going up soon by a factor of 5 or so with the migration to ICD-10) would melt into a puddle on the floor at the prospect of a standard data dictionary comprised of perhaps a similar number of metadata-standardized data elements spanning the gamut of administrative and clinical data definitions cutting across ambulatory and inpatient settings and the numerous medical specialties. We're probably already a good bit of the way there given the certain overlap across systems, just not in any organized fashion.
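To make the three-fundamental-types argument concrete, here's a toy sketch of my own (all element names are hypothetical, not drawn from any actual standard):

```python
# Toy model of the claim above: three fundamental data types, with everything
# else treated as "representation" metadata. All names here are hypothetical.
from dataclasses import dataclass
from typing import Literal

@dataclass(frozen=True)
class DataElement:
    name: str                                     # standardized element name
    fundamental_type: Literal["text", "number", "image"]
    representation: str                           # code system, precision, file format, etc.

# Illustrative elements spanning structured text, free text, numbers, and images:
ELEMENTS = [
    DataElement("encounter_diagnosis", "text", "ICD-9 code"),
    DataElement("soap_note_narrative", "text", "unstructured free text"),
    DataElement("hemoglobin_a1c_pct", "number", "floating-point decimal"),
    DataElement("chest_xray_image", "image", ".tiff/.jpeg"),
]
```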

Think about it.
___

UPDATE

What’s behind the 17% EHR Incentive Program dropout rate?
Jennifer Bresnick

The pace at which eligible providers are leaving the EHR Incentive Program is higher than most high school dropout rates, according to the latest data from CMS. Seventeen percent of providers who collected a 90-day incentive payment in 2011 either decided not to pursue the program for the following full year or were unable to sustain their efforts for that long. Despite the continued increase of new providers joining the program, the rate of failure is astonishing and somewhat troubling. If meaningful use can’t hold on to participants by offering big incentives, what will happen when that money goes away – and what does that say about the deeper issues underlying EHR adoption in the United States?

Some EHR implementations simply detonate on impact, falling prey to poor planning or the dreaded EHR backlash from unhappy physicians and clinical staff. Other providers find that their first EHR system doesn’t meet their needs, and hunt for a replacement that better suits their practice’s style, which might delay their participation in meaningful use by a few months. But after investing in a certified EHR, adapting their workflow to successfully meet the EHR Incentive Program’s requirements for three months, and collecting their money from CMS, why would a provider decide that continuing on that path just isn’t worth it?...

It’s no secret that physicians just want to practice medicine. They want to spend time with their patients and have the freedom to decide how to run their own businesses and take their own notes. Not everyone sees the EHR Incentive Program as an effective way to make medicine smarter and reduce costs. Not everyone agrees with the ONC’s methods of fostering interoperability of health IT systems, or the way CMS is threatening non-compliant providers.

But left to its own devices, it’s unlikely that healthcare would make the necessary sacrifices to bring it into the digital age and realize the benefits of health IT. It’s clear from the haphazard and reluctant ICD-10 transition that people will put off anything that isn’t mandated as long as humanly possible, no matter how inadequate their current state of affairs.


Meaningful use may be painful, and it may be difficult. It may not be perfect.  However, providers who drop out of a program that’s paying them to participate are setting themselves up for an uncertain future, and need to think carefully about frying pans and fires before choosing to leave meaningful use behind for good.
Nice.
___

More to come...

Monday, June 24, 2013

Hiatus, briefly


I've been taking some downtime across the past week to do some serious, arduous daily dawn-to-dusk home and yard improvement work. The Dawg Days of Summer seem to extend to Health IT, so it seemed a good time. My daily Google news searches haven't been yielding much that's "new" lately. To wit:


First Google "meaningful use" search result this morning (default search by "relevance"). It links to a FierceEMR story that's almost a year old.

Whatever. I'm getting close to finishing up the yard (we're prepping to sell the house), and will be back on task here forthwith.

I've also taken up posting on Pinterest.


Just a bit of diversionary fun.
___

More to come...

Saturday, June 15, 2013

" It's not one-size-fits-all"

The interoperability standards in the meaningful use rules have come under fire in recent months for being too weak. But representatives from The Office of the National Coordinator for Health Information Technology (ONC) and the Centers for Medicare and Medicaid Services (CMS) are defending what they call strong provisions to move the industry forward.

Speaking in a June 6 webinar, "CMS and ONC eHealth provider webinar on advancing interoperability," Steve Posnack, director of the ONC's Federal Policy Division, said the stage 2 rules may not deliver full, industry-wide interoperability immediately. Policymakers have to take into account limitations inherent in the regulatory process, as well as provider and vendor readiness to adopt standards.

Posnack compared criticism of the lack of interoperability in the stage 2 rules to people faulting the government for there being no cars on the road that are ready to meet higher gas-mileage standards. There isn't a lot regulatory agencies can do if the industry doesn't produce products that comply with stiffer requirements.

Still, he said progress is being made, and this progress will be more apparent as providers start transitioning to stage 2 of meaningful use.

"It's best to remember it's not one-size-fits-all," Posnack said. "It's not one solvable problem. We're about to get there; it's coming and you're going to see it very soon. We are making a lot of progress, and we're moving in an incremental and deliberate fashion."

The webinar responded to criticism that has been leveled at the meaningful use program in recent months for not doing enough to advance interoperability standards.

In May, a group of six Republican senators sent a letter and white paper to Department of Health and Human Services Secretary Kathleen Sebelius asking the agency to address what the senators see as insufficient policies for supporting interoperability...
"It's best to remember it's not one-size-fits-all," Posnack said. "It's not one solvable problem."

Well, I disagree, with respect to the Big Picture, the national goal. It should be.

One master RDBMS data dictionary standard specifying the precise name, data type, data length, and cardinality (inclusive of no-dupes/no-nulls requirements). Yes, of course, incorporate work already done, e.g., ICD-9, ICD-10, CPT, RxNorm, SNOMED-CT, LOINC, and CVX.

Yes, it would necessarily be a large dictionary table. So what? It wouldn't be that large, and it would certainly be manageable. Yes, like any standard, it would be open to revision and appending over time by the Standards Body. No, not every HIT vendor would have to use every data element, only those relevant to the product customer target (beyond those common to all medical disciplines and those required by Meaningful Use and its eventual successor).

"Interoperability"/"data mapping" problem solved. Let all vendors compete on features, functionality, speed, "look and feel" (UX) usability (also UX), price, support, etc.

Instead of the despised opacity of "Vendor Lock" data siloing.
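By way of illustration only, here's a minimal sketch of what such a master dictionary might look like as a table, rendered in SQLite from Python. The layout, column names, and sample row are my hypotheticals, not any actual standard:

```python
# Hypothetical master data dictionary sketch (not an actual standard).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE data_dictionary (
    element_id   TEXT PRIMARY KEY,     -- unique: the "no dupes" requirement
    element_name TEXT NOT NULL UNIQUE, -- precise standard name; no nulls
    data_type    TEXT NOT NULL CHECK (data_type IN
                     ('text', 'integer', 'decimal', 'datetime', 'image')),
    max_length   INTEGER,              -- data length, where applicable
    code_system  TEXT,                 -- e.g., ICD-10, CPT, RxNorm, SNOMED-CT, LOINC, CVX
    cardinality  TEXT NOT NULL CHECK (cardinality IN ('0..1', '1..1', '0..*', '1..*'))
);
-- One illustrative row: a lab result element bound to LOINC.
INSERT INTO data_dictionary VALUES
    ('DD-000001', 'hemoglobin_a1c_result', 'decimal', NULL, 'LOINC', '0..*');
""")
print(conn.execute("SELECT element_name, data_type, cardinality FROM data_dictionary").fetchall())
```

Vendors would then compete on everything above that layer, while exchange partners map to one authoritative set of element definitions rather than to each other.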


See my August 29th, 2012 post Single source of Truth.

UX improvement props

ScienceBasedMedicine.org just underwent a facelift.

Nice. Improved readability.

MONDAY MORNING UPDATE

HealthcareITnews.com

“There are no plans for any more extensions,” he told attendees, repeating it twice, for emphasis.

The current extension, from Oct. 1, 2013 to Oct. 1, 2014 was partly due to looking at the incremental changes needed in reforming healthcare, and realizing that “sometimes, extensions are needed," Mostashari said. It was also to allow for a crosswalk to be made between ICD-10 and (Systematized Nomenclature Of Medicine Clinical Terms) SNOMED, a more physician friendly systematically organized computer-processable collection of medical terms used for diagnoses...
Yeah. He better hope that the pending national HIX rollouts go smoothly. If they turn out to be the Cluster[bleep] many are predicting (and which the GOP in particular is salivating over), hordes of status quo interests will descend on The Hill to try to scuttle the ICD-10 move.
States running out of time on health insurance exchanges
By Amanda White, Washington Post, June 16

With the deadline for states to implement Affordable Care Act-mandated health insurance exchanges less than four months away, state governments will need to move fast.

States are having to reevaluate their existing health insurance infrastructures to meet the act’s requirements. They have already received nearly $4 billion in funding for the effort thus far — and can access more dollars through 2014...
__

The endless, mindless "interoperability" debate just goes on and on and on...

Despite progress on health IT interoperability, tough questions remain
June 17, 2013 | By Susan D. Hall
Data-sharing in healthcare remains difficult, and despite assertions that the industry is on the cusp of a breakthrough, many are impatient with the slow pace of progress.

Those attending the Digital Healthcare Conference in Madison, Wis., last week addressed some of the biggest questions about the sad state of interoperability, according to InformationWeek. Among them:

Shouldn't data standards allow easier sharing across vendor systems? Epic CEO and founder Judy Faulkner (pictured) said that data standards describe only "a very, very small subset of the data that's really there," according to the article. Intermountain Healthcare CIO Marc Probst has told FierceHealthIT that lack of standards has his team redesigning interfaces over and over. At the conference, Jamie Ferguson, vice president of health IT strategy and policy for Kaiser Permanente, however, said that existing standards are "perfectly good" for close to two-thirds of needed records, but that electronic health records tend not to be implemented well based on the standards...
__

MY NEW TWITTER FRIENDS


Gotta love 'em.

JUNE 21st UPDATE

Mostashari: ICD-10, Meaningful Use can be synergistic
June 20, 2013 | By Dan Bowman

Contrary to what many in the health IT industry think, National Coordinator for Health IT Farzad Mostashari said he does not see the transition to ICD-10 as disruptive to the Meaningful Use process.

Instead, Mostashari said in a recent interview with Healthcare IT News, he believes that ICD-10 can serve as a motivation of sorts for providers moving forward with EHR implementation.

"If anything, I'm seeing that if we can get the synergy going … people seeing if I have a Meaningful Use certified EHR, if I have clinical documentation, then it's easier for me to get to ICD-10, then that's another reason for me to move forward on the clinical side," he said.

Still, Mostashari said he thinks there's "money to be made" by vendors who can help ease the ICD-10 transition for frustrated providers. He said that he envisions companies creating tools that can help providers avoid having to remember thousands of codes by instead suggesting a handful of codes to use, depending on a given scenario.

"Anything that eases the burden on frontline clinicians for documentation and coding," Mostashari said. "Those are the kinds of tools I'm thinking of and I'm sure the market will think of many more."

Mostashari's belief that the ICD-10 and Meaningful Use efforts can be synergistic differs from opinions expressed last month by the College of Healthcare Information Management Executives (CHIME), which called for a one-year extension of Meaningful Use Stage 2. CHIME CEO Russell Branzell, in a phone conversation with FierceHealthIT, cited ICD-10 as one of several factors in its request...
Another vendor opportunity? Well, we'll see. Who's gonna pay for all of this?

I know: "Asked and Answered."
___

More to come...

Sunday, June 9, 2013

SPC for Lean Newbies

I noted my concern a couple of posts back that the Lean Healthcare Transformation Summit 2013 appeared to be way light on the technical detail issue of SPC (Statistical Process Control) as a core component of the PDSA cycle that is otherwise touted as the foundation for the Lean process.

PDSA should really be "SPDSA" -- Study, Plan, Do, Study, Act.

I guess it's implicit in the "Plan" part (study the current state and incorporate the findings into your Plan). But, I didn't see much evidence of the quantification imperative of that in Orlando.

In fairness, my cautionary dubiety about "Six Sigma" aside, the DMAIC people are on point here (props to the Wiki):
Define
The purpose of this step is to clearly articulate the business problem, goal, potential resources, project scope and high-level project timeline. This information is typically captured within the project charter document. Write down what you currently know. Seek to clarify facts, set objectives and form the project team. Define the following:

  • A problem statement
  • The customer(s)
  • Critical to Quality (CTQs) — what are the critical process outputs?
  • The target process subject to DMAIC and other related business processes
  • Project targets or goal
  • Project boundaries or scope
  • A project charter is often created and agreed upon during the Define step.
Measure
The purpose of this step is to objectively establish current baselines as the basis for improvement. This is a data collection step, the purpose of which is to establish process performance baselines. The performance metric baseline(s) from the Measure phase will be compared to the performance metric at the conclusion of the project to determine objectively whether significant improvement has been made. The team decides on what should be measured and how to measure it. It is usual for teams to invest a lot of effort into assessing the suitability of the proposed measurement systems. Good data is at the heart of the DMAIC process:

  • Identify the gap between current and required performance.
  • Collect data to create a process performance capability baseline for the project metric, that is, the process Y(s) (there may be more than one output).
  • Assess the measurement system (for example, a gauge study) for adequate accuracy and precision.
  • Establish a high level process flow baseline. Additional detail can be filled in later.
Analyze
The purpose of this step is to identify, validate and select root causes for elimination. A large number of potential root causes (process inputs, X) of the project problem are identified via root cause analysis (for example, a fishbone diagram). The top 3-4 potential root causes are selected using multi-voting or another consensus tool for further validation. A data collection plan is created and data are collected to establish the relative contribution of each root cause to the project metric, Y. This process is repeated until "valid" root causes can be identified. Within Six Sigma, often complex analysis tools are used. However, it is acceptable to use basic tools if these are appropriate. Of the "validated" root causes, all or some can be carried forward into the Improve step:

  • List and prioritize potential causes of the problem
  • Prioritize the root causes (key process inputs) to pursue in the Improve step
  • Identify how the process inputs (Xs) affect the process outputs (Ys). Data is analyzed to understand the magnitude of contribution of each root cause, X, to the project metric, Y. Statistical tests using p-values accompanied by Histograms, Pareto charts, and line plots are often used to do this.
  • Detailed process maps can be created to help pin-point where in the process the root causes reside, and what might be contributing to the occurrence.
Improve
The purpose of this step is to identify, test and implement a solution to the problem; in part or in whole. Identify creative solutions to eliminate the key root causes in order to fix and prevent process problems. Use brainstorming or techniques like Six Thinking Hats and Random Word. Some projects can utilize complex analysis tools like DOE (Design of Experiments), but try to focus on obvious solutions if these are apparent.

  • Create innovative solutions
  • Focus on the simplest and easiest solutions
  • Test solutions using Plan-Do-Study-Act (PDSA) cycle
  • Based on PDSA results, attempt to anticipate any avoidable risks associated with the "improvement" using FMEA
  • Create a detailed implementation plan
  • Deploy improvements
Control
The purpose of this step is to sustain the gains. Monitor the improvements to ensure continued and sustainable success. Create a control plan. Update documents, business process and training records as required.


A Control chart can be useful during the control stage to assess the stability of the improvements over time.
OK, thought experiment example. I Googled "control chart" and just picked one based on visual appeal.

So, let's call this Current State Customer Support Email Response Cycle Time and do a quick bit of Photoshopping. The idea here is cycle time from date/time receipt of a customer support email request to the time a response is recorded as "delivered" (not opened and read, just "delivered" -- because that's all we control).

I eyeballed and added the 2 sigma upper and lower "warning limit" lines in yellow.

Let's assume we culled a random sample of n=160 out of our support email server inbox. We find a current state of roughly two days response time, ~58 hours worst case. The sample appears to be roughly normally distributed (though we could test for that), and compliant with Gaussian assumptions for our purposes (though 2 control-limit "outliers" at n=160 begs a question; the Gaussian expectation at n=160 is only 160 x 0.0027, or roughly 0.4 points beyond 3 sigma, so 2 is ~5x what we might expect by The Book. Still...).
  • Standard Deviation ("1 sigma") is 3.38 hours (I had to calculate this from the original data; no biggie).
  • C.V. ("Coefficient of Variation," a.k.a. "Relative Standard Deviation" or "RSD") is ~7.1%, meaning we can unremarkably expect +/- 7.1% variation around the mean response time, current process (that's what "standard deviation" means -- expected variability).
  • The variation spread between the UCL and LCL, then, is about 42.5% relative to the mean.
The RSD is simply a measure of variability relative to the mean. It is useful. High RSD is a red flag, given that a core goal of any QI method is reduction of variation.
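To make the arithmetic concrete, a minimal sketch (the underlying n=160 sample isn't reproduced here, so any inputs are hypothetical; the figures in the comments are the ones cited above):

```python
# Control-chart arithmetic for the thought experiment (hypothetical data).
import statistics

def chart_params(hours):
    """Mean, sigma, RSD, 2-sigma warning limits, and 3-sigma control limits."""
    mean = statistics.mean(hours)
    sigma = statistics.stdev(hours)   # "1 sigma," computed from the raw data
    return {
        "mean": mean,
        "sigma": sigma,
        "rsd_pct": 100 * sigma / mean,                  # ~7.1% in the example
        "warning_limits": (mean - 2 * sigma, mean + 2 * sigma),
        "control_limits": (mean - 3 * sigma, mean + 3 * sigma),
        "ucl_lcl_spread_pct": 100 * 6 * sigma / mean,   # ~42.5% in the example
    }

# Sanity check on the cited figures: sigma 3.38 at RSD ~7.1% implies a mean of
# roughly 3.38 / 0.071 ~= 47.6 hours ("roughly two days response time"), and a
# UCL-LCL spread of 6 * 3.38 ~= 20.3 hours, i.e., ~42.5% of that mean.
```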

OK: Notwithstanding that this appears to be (in our thought experiment) a representative baseline random sample (no evident non-zero trendline, a trend being one basic marker of process instability), I'd be wanting to drill down deeper. But that's another, more subtle issue.
For example, might we isolate all of the encounters which are, say, below -1 sigma (quicker response times), and look for any commonalities (i.e., identifiable "special causes")? As I noted in prior posts discussing HEDIS data examples, I might see a nominally random scatter depicting no apparent relation between cost and quality (below, CAD outcomes by cost proxies), but I'd be on the data-mining lookout for anything unique in that first quadrant. What are the people in the high-quality, low-cost segment doing right?

__

OK, so, back to our "control chart," we have some current state data. We then have to decide upon what will constitute a "significant" improvement should we undertake to try a process change. In science, you decide and document this prior to proceeding to your "Do" stage.

The salient (and difficult) question becomes one of declaring something along the lines of "we can reduce response cycle time by 20% with a concomitant reduction in variability" by doing "X".

At this point, "Do X," measure the upshot ("Study"), and "Act" on the basis of your findings.
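A sketch of what that pre-declared test might look like in code, assuming before-and-after samples of response times; the 20% target, the alpha, and the function name are my hypotheticals (Welch's t-test via SciPy):

```python
# Hypothetical pre-declared success criterion for the "Do X" trial:
# >= 20% mean cycle-time reduction AND a statistically significant
# before/after difference (Welch's t-test, unequal variances assumed).
from scipy import stats

def improvement_confirmed(before_hours, after_hours,
                          target_reduction=0.20, alpha=0.05):
    mean_before = sum(before_hours) / len(before_hours)
    mean_after = sum(after_hours) / len(after_hours)
    reduction = (mean_before - mean_after) / mean_before
    t_stat, p_value = stats.ttest_ind(before_hours, after_hours, equal_var=False)
    # The "concomitant reduction in variability" half of the declaration could
    # be checked analogously, e.g., by comparing sample sigmas or RSDs.
    return reduction >= target_reduction and p_value < alpha
```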

This stuff is no "thought experiment" abstraction to me. It was my daily life in the 1980's in Oak Ridge (below). I painstakingly wrote the code that rendered this (PDF).

This is admittedly pedestrian "old school" QC stuff, but it's at the heart of being scientific.

ERRATUM

While attending the Lean Healthcare Transformation Summit 2013 "CEO Panel" discussion session last week, I had the irascible thought "my, my -- what an incredibly diverse group of middle-aged white men." I noted the absence of women CEOs in a tweet.

This just came in my inbox.


THE BLOG COMMENT OF THE DAY, IN A NUTSHELL

Had to Photoshop it.


From a comment on The Health Care Blog today.
__

JUNE 11 MEANINGFUL USE STAGE 2 ITEM


...SNOMED CT clinical terminology is not widely adopted among providers and vendors, yet Stage 2 starts in October 2013 for hospitals. In particular, EHRs don’t capture communication codes present in 2014 CQMs, such as a specific code that conveys among physicians the degree of a medical condition, or “exclusion” codes that give a patient’s reason for declining medication or notes a patient doesn’t qualify for the medication, DeLano explains. Nor are most providers yet familiar with using SNOMED for clinical documentation, he adds.

Further, adopting SNOMED codes for clinical documentation is a major task, not so far from the complexity of ICD-10, DeLano contends, but the time needed to focus on SNOMED isn’t available as the industry adopts ICD-10. There are benefits to using SNOMED, but if providers and vendors aren’t ready for it, then they won’t be able to attest for meaningful use, he notes. “Providers think they are good because they are on a certified EHR product, but won’t get the clinical quality measures they want if the codes aren’t properly mapped.”

Asked if the federal government recognizes a gap in SNOMED readiness for Stage 2, DeLano says, “I think there is awareness that there will be a shortfall in the reporting of CQMs.”...
Interesting. Concerns have been voiced over the utility of CQMs. e.g.,
Validity of electronic health record-derived quality measurement for performance monitoring

Amanda Parsons, Colleen McCullough, Jason Wang, and Sarah Shih

J Am Med Inform Assoc. 2012 Jul-Aug; 19(4): 604–609.
Published online 2012 January 16.
...We looked across the 11 clinical quality measures to assess where information was documented. The presence of data recognized for automated quality measurement varied widely, ranging from 10.7% to 99.9% (table 2). Measure components relying on vitals, vaccinations, and medications had the highest proportion of information documented in structured fields recognized by the automated quality measures. The majority of diagnoses for chronic conditions such as diabetes (>91.4% across measures), hypertension (89.3%), ischemic cardiovascular disease (>78.8% across measures) and dyslipidemia (75.1%) were documented in the problem list, a structured field used for automated quality measurement. Patient diagnoses not recognized for inclusion in the measure were recorded in the medical history, assessment, chief complaint, or history of present illness, sections that typically allow for free-text entries.

Diagnostic orders or results for mammogram had the lowest proportion (10.7%) of data recorded in structured fields recognized for automated quality measurement. The majority of the information for breast cancer screening was found as scanned patient documents and diagnostic imaging; both sources of information are not amenable for automated electronic queries.

Nearly half of the information for measures that require a laboratory test result, such as control of hemoglobin A1c and cholesterol, was documented in structured fields recognized for automated quality measurement (range 53.4–63.0%). Similarly, only half of the information regarding patient smoking status (53.4%) was recognized for automated quality measurement.

With the exception of medications, vaccinations, and blood pressure readings, practices varied substantially in where they chose to document the data elements required for automated quality measurement.


In estimating the denominator loss due to unrecognizable documentation, the average practice missed half of the eligible patients for three of the 11 quality measures—hemoglobin A1c control, cholesterol control, and smoking cessation intervention (table 3). No statistically significant differences were observed between the e-chart and EHR automated quality measurement scores in the number of patients captured for the denominator for the remaining eight measures. Current EHR reporting would underreport practice numerators for six of the 11 measures—hemoglobin A1c control, hemoglobin A1c screening, breast cancer screening, cholesterol control, cholesterol screening, and smoking status recorded.

...More studies are needed to assess the validity of EHR-derived quality measures and to ascertain which measures are best calculated using claims or administrative data or a combination of data sources. If provider-specific quality measurements are to be reported and made public, as is the plan for the meaningful use quality measures, further analysis is needed to understand the limitations of these data, particularly if they are prone to underestimation of true provider performance.
See also

Inaccurate quality reports could skew EHR incentives: study
By Maureen McKinney
Posted: January 15, 2013 - 1:00 pm ET


Electronically reported clinical quality measures vary widely in accuracy, an obstacle that could hinder the federal government's electronic health-record incentive program, according to a study appearing in the Annals of Internal Medicine.
The problem could lead to the highest quality providers not being given the intended incentives, the study concluded.

Beginning in 2014, participants in the CMS' EHR incentive program will be required to report quality data via EHRs. Currently, most quality-reporting initiatives rely on administrative billing data, which has drawn criticism for a lack of clinical relevance, or manual record review, which is time-consuming. Many experts have pointed to EHR-extracted quality data as the best representation of actual patient care.

But researchers, using 2008 data from more than 1,100 patients treated at a federally qualified health center, found big gaps in sensitivity from one electronic measure to another. For instance, electronic measures significantly underestimated the health center's rates of pneumococcal vaccinations and appropriate use of asthma medication, when compared with manual record review...
“If electronic reports are not proven to be accurate, their ability to change physicians' behavior to achieve higher quality, the underlying goal, will be undermined,” they said in the study...
CQMs sometimes reek of "Quadrant Three."
___

More to come...

Friday, June 7, 2013

"Data flowing at the speed of trust"?

"Data flowing at the speed of trust"?

That statement was made last year by ONC chief Farzad Mostashari, extolling the promise of HIE (Health Information Exchange). Today, following a long, delayed trip back to Vegas from Orlando, I groggily arose to a total media shitstorm over new revelations regarding personal data apparently flowing to the National Security Agency (NSA) absent any private citizens' knowledge, consent, or "trust."

See my 2008 blog post Privacy and the 4th Amendment amid the "War on Terror"
"You have no privacy, anyway. Get over it."
- Scott McNealy, 1999

I've been studying and writing about privacy issues since grad school in the mid 1980's. I helped kill the original DARPA "Total Information Awareness" proposal. I served on the privacy and security task force for our Nevada HIE, HealtHIE Nevada. I know my HIPAA stuff. I know my 4th Amendment stuff.

Ironic to a significant degree, given that I am so public. Same website address since the 80's. I never post on blogs and news sites anonymously or using some untraceable handle. I'm pretty open.

Will have to watch how this NSA story develops.

MSNBC story link here
Bits of you are all over the Internet. If you've signed into Google and searched, saved a file in your Dropbox folder, made a phone call using Skype, or just woken up in the morning and checked your email, you're leaving a trail of digital crumbs. People who have access to this information — companies powering your emails and Web searches, advertisers who are strategically directing ads at you — can build a picture of who you are, what you like, and what you will probably do next. Revelations about government counter-terrorism programs such as PRISM indicate that federal agents and other operatives may use this data, too.

"Google knows what kinds of porn everyone in the world likes," Bruce Schneier, a security and cryptography expert told NBC News. Not only are companies tracking what you are doing, they are correlating it, he said.

Since news of PRISM broke, the leaders of the tech companies have denied knowledge of government access to their information. At Facebook, one of the world's biggest data collectors, Mark Zuckerberg posted a message that read: "When governments ask Facebook for data, we review each request carefully to make sure they always follow the correct processes and all applicable laws, and then only provide the information if it is required by law."
But the law already permits quite a bit of digital sniffing — much of it without a warrant...
Painting a picture of you
Gather all of these shreds of metadata, apply some algorithms that spot clues in patterns, and you can put together a pretty good idea of who a person is, and what they're up to.

For example, when a group from MIT analyzed location data from cellphones of 1.5 million people in a single country over 15 months, the team could identify individuals simply by knowing where they were on four separate occasions...
Back to health care, and Big Data analytics

I was driving back from the music store today, listening to NPR. They had a piece on the "Health Datapalooza" conference going on while I was at the LEI Lean Conference (mp3 audio embedded below). It featured the ever-so-interesting and garrulous AthenaHealth CEO Jonathan Bush (cousin of GWB).

Below:
Speaking of Big Healthcare Data and "trust"


SUNDAY UPDATE

We now have the next Bradley Manning

Boston.com

A British newspaper Sunday revealed the source of the leak revealing the NSA’s extensive surveillance of US communications.

Edward Snowden, a former CIA technical assistant who now works for a defense contractor with ties to the National Security Agency, asked the Guardian newspaper to reveal his identity.

“I have no intention of hiding who I am because I know I have done nothing wrong,” he told the newspaper in a remarkable, rambling interview that touched on his reasons for the leak, how he took precautions in not revealing documents that could harm particular people, how he became disillusioned, and how he expects his life as he knows it to end. “I don’t want the story to be about me. I want it to be about what the US government is doing.”

“I’m willing to sacrifice all of that because I can’t in good conscience allow the US government to destroy privacy, Internet freedom and basic liberties for people around the world with this massive surveillance machine they’re secretly building.”...
There will be total bipartisan cohesion around making the story all about you, son.


USDOJ hasn't done squat about the trillions of dollars of Wall Street crimes in more than 4 years, but has gone after this young man in less than 12 hours.

BEWARE OF GEEKS BEARING GIFTS

Below: just got this link from Jon Taplin:

FT.com
On Monday, Barack Obama’s administration begins its court martial of Bradley Manning, the former US army private who uploaded hundreds of thousands of classified documents to WikiLeaks. Reasonable people disagree on whether Mr Manning “aided the enemy” (as President Obama’s prosecutors allege) or is a hero for helping to educate us about Washington’s shadowy drone programme. Most are surprised the White House is demanding a life sentence four years after putting Mr Manning behind bars. In their view, Mr Obama is a self-confessed geek with Silicon Valley’s transparent “Do no evil” values. Yet he regularly betrays these with his “Nixonian” mania for secrecy...
...[W]hile big data brings innovation, it also has dangerous side effects. Culture is already pushing Americans towards “data nudism”. Such currents will only get more acute. Before long, it will be possible to map an individual’s genetic sequencing at an affordable price. No one will be forced to attach their genetic record to online dating profiles. But potential mates may assume that anyone who chooses not to is concealing a genetic disorder.

America’s middle classes are already in thrall to their often capricious credit scores – a determination that is notoriously hard to correct. In a world where the average home will have hundreds of sensors, and where ubiquitous tracking systems can intimately map an individual’s habits, the right to privacy could become an economic tool of survival. Already, US employers often demand a credit score, a drugs test and fingerprinting from many kinds of applicant. In the new digital world, the right to expunge past blemishes may turn into a rumbling civil struggle.
Should such futurology bother Mr Obama? Yes. A century ago, Theodore Roosevelt pushed back against the power of the rail barons and oil titans – the great technological disrupters of his day. Mr Obama should pay closer heed to history. And he should become wary of geeks bearing gifts.

May we live in interesting times, indeed. It will certainly be an interesting Beltway week.
Can't wait to hear what Michele Bachmann has to say about all of this.
___

More to come...

Tuesday, June 4, 2013

Lean Healthcare Transformation Summit 2013


Really looking forward to this. Will be meeting and speaking with the heavy hitters of the Lean movement. Mark Graban will be there. We will certainly have a good discussion.


This book is excellent.
A Minute to Learn, a Lifetime to Master 

The basic concepts of Kaizen might seem simple at first. Ask your employees for ideas. Say yes to most of them. Let people implement their own ideas, but help them as a servant leader, if needed. Document the improvements simply. Recognize and thank people for their improvements. Share the ideas with others. The term “Quick and Easy Kaizen” refers to employees identifying and implementing easy improvements that can be done quickly. Creating, growing, nurturing, and sustaining a Kaizen program is neither quick nor easy in a department or a healthcare system. Leaders need to help initiate and support Kaizen, while working tirelessly to create the conditions that encourage people to openly identify problems and work together with their colleagues on improvement. Kaizen requires leaders at all levels to actively make time to inspire, coach, mentor, and recognize people...
The Kaizen culture is about gaining control of one’s work, workspace, work life, attitude, and destiny. It is about creating a safe and secure future for you and your organization. It is about enabling your organization to become and remain the service provider of choice in an area. It is about thinking and learning how to make the world a better place in every way— starting with healthcare practices, not broadly, but in a specific department and workplace. When people gain more control of their world, they are happier...
In healthcare, the primary customers are patients and their families. Within a Kaizen culture, they are happy with the services being provided because the customers have been studied and understood and the value they want and need is delivered to them each time, exactly when it is needed. Furthermore, as the engagement studies earlier pointed out, a top-ten engagement driver is an organization that focuses on customer satisfaction. Employees define success as high-quality care and great service to patients and families. Employees realize that, if they can deliver better service to patients, they are contributing to revenue growth and the long-term strength of the organization, as well as their own job security.
Studies suggest that high employee satisfaction correlates with patient outcomes and lower rates of medical errors. A Towers Watson study concluded, “It was found that employees’ views of empowerment, career development opportunities and teamwork influenced engagement. Further, employee engagement was a key predictor of patient satisfaction, leading to an increased likelihood that patients would recommend the network’s hospitals to others.” It might seem reasonable to conclude that there is causation, not just correlation between these factors.
"NEWS" UPDATE

Every morning I scour the 'net with a variety of search terms and phrases, e.g., "meaningful use." This beauty just popped up.

Meeting Meaningful Use Criteria and Managing Patient Populations: A National Survey of Practicing Physicians
Catherine M. DesRoches, DrPH; Anne-Marie Audet, MD; Michael Painter, MD; and Karen Donelan, ScD
Background: Meaningful use, as defined by the Centers for Medicare & Medicaid Services, will require the aggregation of patient data to enable population assessment. Little is known about the proportion of physicians who are able to meet meaningful use criteria or their use of electronic health records (EHRs) to manage patient populations.

Objective: To evaluate physicians’ reports of EHR adoption and ease of use and their ability to use EHRs for patient panel management.

Design: National mailed survey of practicing physicians (response rate of 60%).

Setting: Late 2011 and early 2012.

Participants: 1820 primary care physicians and specialists in office-based practices.

Measurements: Proportion of physicians who have a basic EHR and meet meaningful use criteria, and ease of use of computerized systems designed for patient population management tasks.

Results: A total of 43.5% of physicians reported having a basic EHR, and 9.8% met meaningful use criteria. Computerized systems for managing patient populations were not widespread; fewer than one half of respondents reported the presence of computerized systems for any of the patient population management tasks included in the survey. Physicians with such functionalities reported that these systems varied in ease of use. Physicians with an EHR that met meaningful use criteria were significantly more likely than those not meeting the standard to rate panel management tasks as easy.

Limitation: Ease-of-use measures are subjective.

Conclusion: Few physicians could meet meaningful use criteria in early 2012 and using computerized systems for the panel management tasks was difficult. Results support the growing evidence that using the basic data input capabilities of an EHR does not translate into the greater opportunity that these technologies promise.

Primary Funding Source: Commonwealth Fund and Robert Wood Johnson Foundation.
How much more behind the times can one get? "News" means new information.

JUNE 6th UPDATE:

This dated and misleading "news" is being uncritically re-reported in the Health IT press.
JUNE 6TH MEANINGFUL USE UPDATE
Meaningful use incentives ascend past $14.5B
June 06, 2013 | Diana Manos, Healthcare IT News
As of the end of April, the federal government has paid out $14.6 billion in EHR incentive payments, according to Robert Anthony, deputy director of the HIT Initiative Group at the Centers for Medicare & Medicaid Services’ Office of E-Health Standards and Services.

At the Health IT Policy Committee meeting on Wednesday, Anthony said the numbers were the most current available and show an increasing number of providers are interested in the program. There were 395,000 eligible providers and hospitals in “active registration” in the federal meaningful use program--out of a total pool of 532,000.

Though Medicaid providers lag behind Medicare providers in the program, CMS is encouraged by the steady increase in Medicaid eligible providers signing up to participate. To date, there are some 13,000 Medicaid meaningful users. “We’re seeing more and more come in month-to-month,” Anthony said. “In April, 3,200 came in and demonstrated meaningful use.”...
__

LEAN SUMMIT REGISTRATION AND RECEPTION

Seamless trip from LAS to MCO. Made it to the hotel just in time for registration and the opening evening reception. Attendance is about 600. I have a feeling I'm about to encounter a lot of fine minds.

JUNE 5th CONFERENCE DAY 1

Very nice keynotes this morning. Lots to continue to think about.

Denis M. Donovan, MD. Interesting fellow. A psychiatrist. Skeptical of Lean getting tipping-point traction.

John Toussaint, MD. I have both of his books, in hardbound and Kindle editions.

Francois de Brantes, HCI3.org
Chet Marchwinski, LEI Communications Director
One submitted comment challenged the assertion that "variation is the enemy." That's a misreading of the intent of that statement. We have to differentiate between random (common cause) variation and assignable (special cause) variation. Deming 101. Recall that you have to "stabilize" a process by removing all special cause variation, so that all that remains is random variation around a process mean, before attempting PDSA, lest you commit the QI sin of "tampering" with an unstable process?

'eh?
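
For the non-Deming-fluent, here's a minimal sketch of the idea in Python -- illustrative only. The data and function names are hypothetical, my own invention rather than anything presented at the Summit. It flags the "special cause" points you'd have to explain and eliminate before trusting PDSA on the process:

def xmr_limits(values):
    # Center line and 3-sigma "natural process limits" for a Shewhart
    # individuals (XmR) chart; 2.66 is the standard constant that
    # converts the mean moving range into 3-sigma limits.
    mean = sum(values) / len(values)
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, len(values))]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def special_cause_points(values):
    # Rule 1 only: points beyond the natural process limits. A real
    # SPC analysis would also apply the Western Electric run rules.
    _mean, lcl, ucl = xmr_limits(values)
    return [(i, v) for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical weekly clinic cycle times, in minutes:
cycle_times = [32, 35, 31, 33, 36, 34, 58, 33, 32, 35]
print(special_cause_points(cycle_times))  # [(6, 58)] -- a special cause to investigate

Until that 58 is explained and eliminated, the process is unstable, and "improving" it by tweaking parameters is textbook tampering.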
__

Below, a track session on "Leveraging Information to Improve Patient Care." From the title, I thought it might entail data mining techniques using health IT and/or Lean methods for health IT software QC/QA/QI.

Wrong on both counts. I asked for a show of hands: not one person representing HIT software development, nor did there appear to be any stats people in the room. These were all IT Department Ops people. Decent presentation of useful material to a standing-room-only crowd, but not really what I was hoping for. If anything could use a dose of Lean methodology, it's Health IT software development.

Helen Macfie, Pharm.D., SVP, MemorialCare
The venerable Paul O'Neill, former Alcoa head and Treasury Secretary.


Nice main ballroom crowd for Mr. O'Neill's talk.

Erratum: I broached the topic of "Health 2.0" and "Matthew Holt" during breakfast.

Blank stares. Silencio.

Lean, meet Health 2.0. Health 2.0, meet Lean. C'mon, people.

To be fair, Not-Invented-Here tribalism can be found everywhere one looks, and the Health 2.0 crowd is more narrowly focused on Health IT. Moreover, as George Packer recently observed in The New Yorker in his fabulous piece "Change the World," a peculiar, eclectic narcissism pervades the high-tech Bay Area / Silicon Valley region:
In 1978, the year that I graduated from high school, in Palo Alto, the name Silicon Valley was not in use beyond a small group of tech cognoscenti. Apple Computer had incorporated the previous year, releasing the first popular personal computer, the Apple II. The major technology companies made electronics hardware, and on the way to school I rode my bike through the Stanford Industrial Park, past the offices of Hewlett-Packard, Varian, and Xerox PARC. The neighborhoods of the Santa Clara Valley were dotted with cheap, modern, one-story houses—called Eichlers, after the builder Joseph Eichler—with glass walls, open floor plans, and flat-roofed carports. (Steve Jobs grew up in an imitation Eichler, called a Likeler.) The average house in Palo Alto cost about a hundred and twenty-five thousand dollars. Along the main downtown street, University Avenue—the future address of PayPal, Facebook, and Google—were sports shops, discount variety stores, and several art-house cinemas, together with the shuttered, X-rated Paris Theatre. Across El Camino Real, the Stanford Shopping Center was anchored by Macy’s and Woolworths, with one boutique store—a Victoria’s Secret had opened in 1977—and a parking lot full of Datsuns and Chevy Novas. High-end dining was virtually unknown in Palo Alto, as was the adjective “high-end.” The public schools in the area were excellent and almost universally attended; the few kids I knew who went to private school had somehow messed up. The Valley was thoroughly middle class, egalitarian, pleasant, and a little boring.

Thirty-five years later, the average house in Palo Alto sells for more than two million dollars. The Stanford Shopping Center’s parking lot is a sea of Lexuses and Audis, and their owners are shopping at Burberry and Louis Vuitton. There are fifty or so billionaires and tens of thousands of millionaires in Silicon Valley; last year’s Facebook public stock offering alone created half a dozen more of the former and more than a thousand of the latter. There are also record numbers of poor people, and the past two years have seen a twenty-per-cent rise in homelessness, largely because of the soaring cost of housing. After decades in which the country has become less and less equal, Silicon Valley is one of the most unequal places in America.

Private-school attendance has surged, while public schools in poor communities—such as East Palo Alto, which is mostly cut off from the city by Highway 101—have fallen into disrepair and lack basic supplies. In wealthy districts, the public schools have essentially been privatized; they insulate themselves from shortfalls in state funding with money raised by foundations they have set up for themselves. In 1983, parents at Woodside Elementary School, which is surrounded by some of the Valley’s wealthiest tech families, started a foundation in order to offset budget cuts resulting from the enactment of Proposition 13, in 1978, which drastically limited California property taxes. The Woodside School Foundation now brings in about two million dollars a year for a school with fewer than five hundred children, and every spring it hosts a gala with a live auction. I attended it two years ago, when the theme was RockStar, and one of Google’s first employees sat at my table after performing in a pickup band called Parental Indiscretion. School benefactors, dressed up as Tina Turner or Jimmy Page, and consuming Jump’n Jack Flash hanger steaks, bid thirteen thousand dollars for Pimp My Hog! (“Ride through town in your very own customized 1996 Harley Davidson XLH1200C Sportster”) and twenty thousand for a tour of the Japanese gardens on the estate of Larry Ellison, the founder of Oracle and the country’s highest-paid chief executive. The climax arrived when a Mad Men Supper Club dinner for sixteen guests—which promised to transport couples back to a time when local residents lived in two-thousand-square-foot houses—sold for forty-three thousand dollars.

The technology industry’s newest wealth is swallowing up the San Francisco Peninsula. If Silicon Valley remains the center of engineering breakthroughs, San Francisco has become a magnet for hundreds of software start-ups, many of them in the South of Market area, where Twitter has its headquarters. (Half the start-ups seem to have been founded by Facebook alumni.) A lot of younger employees of Silicon Valley companies live in the city and commute to work in white, Wi-Fi-equipped company buses, which collect passengers at fifteen or so stops around San Francisco. The buses—whose schedules are withheld from the public—have become a vivid emblem of the tech boom’s stratifying effect in the Bay Area. Rebecca Solnit, who has lived in San Francisco for thirty years, recently wrote in The London Review of Books, “Sometimes the Google Bus just seems like one face of Janus-headed capitalism; it contains the people too valuable even to use public transport or drive themselves. Right by the Google bus stop on Cesar Chavez Street immigrant men from Latin America stand waiting for employers in the building trade to scoop them up, or to be arrested and deported by the government.” Some of the city’s hottest restaurants are popping up in the neighborhoods with shuttle stops. Rents there are rising even faster than elsewhere in San Francisco, and in some cases they have doubled in the past year.

The buses carry their wired cargo south to the “campuses” of Google, Facebook, Apple, and other companies, which are designed to be fully functioning communities, not just places for working. Google’s grounds, in Mountain View—a working-class town when I was growing up—are modelled on the casual, Frisbee-throwing feel of Stanford University, the incubator of Silicon Valley, where the company’s founders met, in grad school. A polychrome Google bike can be picked up anywhere on campus, and left anywhere, so that another employee can use it. Electric cars, kept at a charging station, allow employees to run errands. Facebook’s buildings, in Menlo Park, between 101 and the salt marshes along the Bay, surround a simulated town square whose concrete surface is decorated with the word “HACK,” in letters so large that they can be seen from the air. At Facebook, employees can eat sushi or burritos, lift weights, get a haircut, have their clothes dry-cleaned, and see a dentist, all without leaving work. Apple, meanwhile, plans to spend nearly five billion dollars to build a giant, impenetrable ringed headquarters in the middle of a park that is technically part of Cupertino. These inward-looking places keep tech workers from having even accidental contact with the surrounding community. The design critic Alexandra Lange, in her recent e-book, “The Dot-Com City: Silicon Valley Urbanism,” writes, “The more Silicon Valley tech companies embrace an urban model, the harder it becomes for them to explain why they need to remain aloof. People who don’t have badges aren’t just a security risk.”

The industry’s splendid isolation inspires cognitive dissonance, for it’s an article of faith in Silicon Valley that the technology industry represents something more utopian, and democratic, than mere special-interest groups. The information revolution (the phrase itself conveys a sense of business exceptionalism) emerged from the Bay Area counterculture of the sixties and seventies, influenced by the hobbyists who formed the Homebrew Computer Club and by idealistic engineers like Douglas Engelbart, who helped develop the concept of hypertext and argued that digital networks could boost our “collective I.Q.” From the days of Apple’s inception, the personal computer was seen as a tool for personal liberation; with the arrival of social media on the Internet, digital technology announced itself as a force for global betterment. The phrase “change the world” is tossed around Silicon Valley conversations...
The technology industry, by sequestering itself from the community it inhabits, has transformed the Bay Area without being changed by it—in a sense, without getting its hands dirty. Throughout most of Silicon Valley’s history, its executives have displayed a libertarian instinct to stay as far from politics and government as possible. Reid Hoffman described the attitude this way: “Look what I can do as an individual myself—everyone else should be able to do that, too. I can make a multi-billion-dollar company with a little bit of investment. Why can’t the whole world do that?” But the imperative to change the world has recently led some Silicon Valley leaders to imagine that the values and concepts behind their success can be uploaded to the public sphere...
Technology can be an answer to incompetence and inefficiency. But it has little to say about larger issues of justice and fairness, unless you think that political problems are bugs that can be fixed by engineering rather than fundamental conflicts of interest and value. Evgeny Morozov, in his new book “To Save Everything, Click Here,” calls this belief “solutionism.” Morozov, who is twenty-nine and grew up in a mining town in Belarus, is the fiercest critic of technological optimism in America, tirelessly dismantling the language of its followers. “They want to be ‘open,’ they want to be ‘disruptive,’ they want to ‘innovate,’” Morozov told me. “The open agenda is, in many ways, the opposite of equality and justice. They think anything that helps you to bypass institutions is, by default, empowering or liberating. You might not be able to pay for health care or your insurance, but if you have an app on your phone that alerts you to the fact that you need to exercise more, or you aren’t eating healthily enough, they think they are solving the problem.”

Steven Johnson, the author of many books about technology, recently published “Future Perfect: The Case for Progress in a Networked Age.” Johnson argues that traditional institutions and ideologies are giving way to a new philosophy, called “peer progressivism,” in which collective problems are solved incrementally, through the decentralized activity of countless interconnected equals—a process that mirrors the dynamics of the Internet. In politics, peer progressivism could mean the rise of “citizen journalists” tweeting and posting on social media, or an innovation that Johnson calls “liquid democracy,” which would allow you to transfer your vote to a friend who is more knowledgeable about, say, the school board. In this thin book, Johnson takes progress as a given, without seriously considering counter-arguments about stagnation and decline. It would be foolish to argue that America’s mainstream media and political system are functioning as they should, but it’s worth wondering if “peer networks” really have the answers. An essay in the journal New Media & Society, by Daniel Kreiss, of Yale; Megan Finn, of Berkeley; and Fred Turner, of Stanford, points out that a system of “peer production” could be less egalitarian than the scorned old bureaucracies, in which “a person could achieve the proper credentials and thus social power whether they came from wealth or poverty, an educated family or an ignorant one.” In other words, “peer networks” could restore primacy to “class-based and purely social forms of capital,” returning us to a society in which what really matters is whom you know, not what you could accomplish.

A favorite word in tech circles is “frictionless.” It captures the pleasures of an app so beautifully designed that using it is intuitive, and it evokes a fantasy in which all inefficiencies, annoyances, and grievances have been smoothed out of existence—that is, an apolitical world. Dave Morin, who worked at Apple and Facebook, is the founder of a company called Path—a social network limited to one’s fifty closest friends. In his office, which has a panoramic view of south San Francisco, he said that one of his company’s goals is to make technology increasingly seamless with real life. He described San Francisco as a place where people already live in the future. They can hang out with their friends even when they’re alone. They inhabit a “sharing economy”: they can book a weeklong stay in a cool apartment through Airbnb, which has disrupted the hotel industry, or hire a luxury car anywhere in the city through the mobile app Uber, which has disrupted the taxi industry. “San Francisco is a place where we can go downstairs and get in an Uber and go to dinner at a place that I got a restaurant reservation for halfway there,” Morin said. “And, if not, we could go to my place, and on the way there I could order takeout food from my favorite restaurant on Postmates, and a bike messenger will go and pick it up for me. We’ll watch it happen on the phone. These things are crazy ideas.”

It suddenly occurred to me that the hottest tech start-ups are solving all the problems of being twenty years old, with cash on hand, because that’s who thinks them up...
Read all of it. Worth your time.

JUNE 6TH UPDATE: LEAN SUMMIT DAY 2

CEO panel up first. Good discussion, from a variety of institutional perspectives.


CEO Panel moderator Mark Graban, co-author of the excellent Healthcare Kaizen

LAST KEYNOTE SESSION

Alan Gleghorn, CEO, Christie Clinic

Nice to see someone use the team sports analogy. I use it routinely: you have rules, roles, and improvisation in the face of situational variability. The best teams, whether on the court, on the field, or in the clinical setting, cultivate ongoing adaptive "court awareness."
Heading home. The always swell Orlando Airport. I got bumped and got home four hours late.
My summary take on the 2013 Lean Healthcare Transformation Summit is quite favorable. A couple of the sessions I attended were rather pedestrian, but all of the keynotes and the CEO panel were outstanding. I sincerely thank LEI for having me. I will add more thoughts as I finish reviewing my notes.

I was struck by the absence of any explicit SPC (Statistical Process Control) presentations. I hope these Lean PDSA evangelists uniformly know their basic stats, Old School though such things may be. I would encourage cross-collaboration with the ASQ Statistics Division (of which I'm a long-time member).

LEI needs to work on its staging. The lighting was underwhelming. Take a cue from Health 2.0: brighter spots, and put up racks of backlighting. (Steal staging tech from HIMSS, too.) Maybe it's my old background as a musical stage performer, CCTV producer, and Las Vegas live entertainment photographer that compels me to be aware of such things, but they matter for effective live event presentation.
___

More to come...
