Cambridge HealthTech Institute (CHI) invited me to attend their Next Generation Point of Care Diagnostics Conference and I came away thoroughly impressed with the content, speakers, and organization. Since I chair several conferences a year, I know how hard it is to pull off a good one, so I’d like to thank CHI for a job well done. While I took the notes and attended the event, this post was written by HITSphere‘s Vik Subbu, our Digital Health editor who focuses on Bio IT and Pharma IT. Bio IT, Pharma IT, Health IT, and MedTech are all going to be merging over the next few years and Vik will be helping our audience understand those shifts and what they mean to Digital Health innovators. Here’s Vik’s recap of the conference:

Goals & Attendees

The goal of the event was to provide a progress update to the healthcare industry on advances in next generation point-of-care (POC) diagnostics while highlighting the advent of innovative platforms and the use of digital information systems to aid in the development of novel POC diagnostics. The conference was attended by industry experts from various disciplines, including academic institutions, non-profit computational and bioinformatics centers, venture capital firms, service providers, and pharmaceutical, diagnostic, and biotechnology companies.

Why does Point of Care Dx matter to Digital Health innovators?

The interactions and cross-fertilization of ideas among the various disciplines in the diagnostic arena were the highlight of the conference. The ability to have real-time interactions among academic researchers, clinicians, product developers, and reimbursement specialists provided a ‘one stop’ venue for an attendee to obtain a holistic overview of both the promises and pitfalls of developing point-of-care diagnostics. The outcome of the conference should yield greater public-private collaborations involving novel platforms, available NGS datasets, and academic laboratories. Such partnerships will hopefully enable the industry to overcome product development and reimbursement barriers while paving the way for an effective and streamlined approval process for next generation POC diagnostics. All of this will help integrate POC better into next generation Digital Health innovations.

The intimate setting and the organization of the parallel track discussions/presentations were well designed and covered key aspects of POC diagnostics. For anyone looking to learn the current and future directions of POC diagnostics, the conference provided a nice platform to learn, understand, and meet key contacts to support their individual interests. Entrepreneurs and innovators focusing on bridging the “gap” between healthcare IT and diagnostics will find that there was a recurring theme that surfaced in many of the presentations but wasn’t really the focal point of any one of them. That topic was data. Many presentations highlighted the “use of genomic data”, “the use of computational super tools to assimilate or generate vast amounts of data”, or “the need for better data standards to achieve meaningful results”. While these were great presentations, none of the speakers focused on the “HOW” piece (which is a huge opportunity for entrepreneurs). For example, “how can one gain broader insights from these datasets?” or “how can we solve the issues of standardization of datasets?” Perhaps this is the homework assignment we must complete in time for next year’s conference.

Top Ten Insights for Healthcare IT innovators:

  1. Next Generation Sequencing (NGS) will continue to play a vital role in disease detection and biomarker identification
  2. The increasing availability of publicly available datasets from the FDA and academia will help guide the development of next generation POC diagnostics
  3. Point of care diagnostics for hospital acquired infectious diseases remains an unmet need
  4. The need for improved sensitivity and specificity in diagnostic assay platforms is acute
  5. Reimbursement discussions need to occur with payers from day one
  6. Early stage diagnostic companies can benefit from innovative business models and strategic partnerships
  7. Clinical samples are required to validate an assay or biomarker – yet finding these longitudinal samples remains a challenge
  8. Software tools and POC diagnostics have improved disease identification and patient outcomes… but we have a long way to go
  9. Establishing better workflows, processes and teams can lead to better outcomes
  10. Integrating disparate datasets can yield better insights and patient outcomes


Given the number of breaches we’ve seen this summer at healthcare institutions, I’ve spent a ton of time recently on several engineering engagements looking at “HIPAA compliant” encryption (HIPAA compliance is in quotes since it’s generally meaningless). Since I’ve heard a number of developers say “we’re HIPAA compliant because we encrypt our data” I wanted to take a moment to unbundle that statement and make sure we all understand what it means. Cryptology in general and encryption specifically are difficult to get right; CISOs, CIOs, and HIPAA compliance officers shouldn’t just believe vendors who say “we encrypt our data” without asking for elaboration in these areas:

  • Encryption status of data at rest in block storage (the file system that the apps, databases, VMs, are stored on)
  • Encryption status of data at rest in virtual machine block storage
  • Encryption status of data at rest in archived storage (backups)
  • Encryption status of data at rest in the Oracle/SQL*Server/DB2/MySQL/PostgreSQL/(your vendor) databases (which sit on top of the file system)
  • Encryption status of data in transit from database to app server
  • Encryption status of data in transit from app server to proxy server (HTTP server)
  • Encryption status of data in transit from proxy server to end user’s client
  • Encryption status of data in transit from API servers to end user’s clients (iOS, Android, etc.)
  • Encryption status of server to server file transfers
  • Encryption key management in all of the above

When you look at encrypting data, it’s not simply “in transit” or “at rest”; data can be in transit between, or at rest in, a variety of places, as the checklist above shows.
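To make the “at rest in the database” item a bit more concrete, here’s a minimal Python sketch of application-level field encryption using the open source cryptography library’s Fernet recipe. The field names, key source, and environment variable below are illustrative assumptions on my part, not a prescription; and note that this covers only one row of the checklist, since block storage, backups, transit, and key management all still need their own answers.

```python
# Illustrative sketch only: application-level encryption of a PHI field before it
# ever reaches the database. Assumes the `cryptography` package is installed and
# that the key arrives out-of-band (e.g., a KMS or secrets vault), never hard-coded.
import os
from cryptography.fernet import Fernet

def load_key() -> bytes:
    # Hypothetical key source; in production pull from a KMS/HSM or vault and plan
    # for rotation -- key management is the hard part, not the cipher itself.
    return os.environ["PHI_FIELD_KEY"].encode()

def encrypt_field(plaintext: str, key: bytes) -> bytes:
    """Encrypt a single sensitive field (e.g., an SSN) before persisting it."""
    return Fernet(key).encrypt(plaintext.encode())

def decrypt_field(ciphertext: bytes, key: bytes) -> str:
    """Decrypt a field after reading it back from storage."""
    return Fernet(key).decrypt(ciphertext).decode()

if __name__ == "__main__":
    demo_key = Fernet.generate_key()   # demo only; never generate a fresh key per run in production
    token = encrypt_field("123-45-6789", demo_key)
    assert decrypt_field(token, demo_key) == "123-45-6789"
    print("Field-level round trip OK; transit and block-storage encryption are separate layers.")
```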

If you care about security, ask for the details.


These days it’s pretty easy to build almost any kind of software you can imagine — what’s really hard, though, is figuring out what to build. As I work on complex software systems in government, medical devices, healthcare IT, and biomedical IT, I find that tackling vague requirements is one of the most pervasive and difficult problems to solve. Even the most experienced developers have a hard time building something that has not been well defined for them; a disciplined software requirements engineering approach is necessary, especially in safety-critical systems. One of my colleagues in France, Abder-Rahman Ali, is currently pursuing his Ph.D. in medical image analysis and is passionate about applying computer science to medical imaging to come up with algorithms and systems that aid in Computer Aided Diagnosis (CAD). He’s got some brilliant ideas, especially in the use of fuzzy logic and storytelling to elicit better requirements so that CAD may become a reality some day. I asked Abder-Rahman to share with us a series of blog posts about how to tackle the problem of vague requirements. The following is his first installment, focused on storytelling and how it can be used in requirements engineering:

I remember how, when I was a child, my grandmother used to tell us stories, both fictional and non-fictional. They still ring in my ears, even after all the years that have passed. We would just sit down, open our ears, fix our eyes on her, wander along with our thoughts, and not come out of that spell until the story ended. Sometimes, when we made trouble, simply being called over to hear a story was enough to calm us down, and those same feelings came over us again.

Phebe Cramer, in her book Storytelling, Narrative, and the Thematic Apperception Test, mentions how storytelling has a long tradition in human history. She highlights what have been considered the most significant means by which people have told their stories; some of these, for instance, were the famous epic poems: the Iliad and the Odyssey from the ninth century B.C., the Aeneid from around 20 B.C., and the Indian epics the Mahabharata and Ramayana from the fourth century A.D. This is how history was transmitted from one generation to the next.

Storytelling Tips and Tales emphasizes that stories connect us to the past and illuminate the future; lessons can be learned from stories, and information is transmitted transparently and smoothly through them. Teachers in schools are even being encouraged to use storytelling in their classrooms. The book also holds that storytelling is an engaging process that is rewarding for both the teller and the listener: listeners enter new worlds just by hearing the teller’s words. In Knowledge and Memory: The Real Story, Schank and Abelson even argue that psychological studies have revealed that human beings learn best from stories.

Having said that, a requirements engineer may ask: why couldn’t we bring storytelling into our own domain? After all, in our work there is also a teller and a listener. Well, could that really be done?

Let us examine the relationship between story elements and software requirements in order to answer that question.

In his book Telling Stories: A Short Path to Writing Better Software Requirements, Ben Rinzler highlights these relationships as follows (some explanations of the points are also drawn from Using Storytelling to Record Requirements: Elements for an Effective Requirements Elicitation Approach):

  1. Conflict: This is the problem you want to solve in the requirements process. An example is the conflict that occurs between stakeholders’ needs and the FDA regulatory requirements for some medical device software.
  2. Theme:  This is the central concept underlying the solution. For requirements engineering, this could be a “requirement”, that is, the project goal.
  3. Setting: The setting is the place and time of the story. In requirements engineering, this is the broader context of the problem at hand, such as information about the technology environment, the business, and so on.
  4. Plot: The plot of a story is its sequence of events, ordered such that each outcome affects later ones. In requirements engineering, this is the series of actions in the current and future systems.
  5. Character: This refers to any entity capable of action. In requirements engineering, this can for instance represent people, machines, and programs.
  6. Point of view: Having different points of view is important for building a unified picture of what is actually happening and what everyone needs. This is like describing a medical device software process from both the patient’s and the physician’s points of view, for instance.

So, yes, a relationship and an analogy exist between storytelling and software requirements.
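As a small thought experiment of our own (not something prescribed by Rinzler), the six-element mapping above can even be captured as a lightweight data structure so that elicited stories are recorded in a consistent, reviewable form. The Python sketch below is purely illustrative; the class and field names are assumptions we made for this example.

```python
# Hypothetical sketch: recording an elicited "requirements story" using the six
# story elements mapped above. Field names are illustrative, not a standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RequirementsStory:
    conflict: str                  # the problem to solve (e.g., stakeholder needs vs. FDA constraints)
    theme: str                     # the central concept / project goal
    setting: str                   # business and technology context
    plot: List[str]                # ordered series of actions in the current/future system
    characters: List[str] = field(default_factory=list)      # people, machines, programs
    points_of_view: List[str] = field(default_factory=list)  # patient, physician, regulator, ...

story = RequirementsStory(
    conflict="Clinicians want one-click image sharing; FDA guidance requires audit trails",
    theme="Add traceable image sharing to the CAD workstation",
    setting="Hospital radiology department using a PACS-integrated CAD tool",
    plot=["Radiologist opens study", "CAD flags a region", "Report is shared with referring physician"],
    characters=["radiologist", "referring physician", "CAD software", "PACS"],
    points_of_view=["patient", "physician", "regulatory affairs"],
)
print(story.theme)
```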

In future posts in the series, Shahid and I will dig deeper into how storytelling can be employed in the requirements engineering process, and we will also try to show how fuzzy logic can be embedded in the process to address issues that may be inherent in the storytelling method.

Meanwhile, drop us a comment if there are specific areas of requirements engineering for complex software systems that you’re especially interested in learning more about.


Our vision of providing a series of packed one-day events focused on practical, relevant, and actionable health IT advice was very well received in Houston, NYC, and Santa Monica earlier this year. Our next event is in Chicago, and we’re going to continue to eschew canned PowerPoint decks, which limit conversations, and instead deliver on the implications of major trends and operationalizable advice about where to successfully apply IT in healthcare settings. As usual, the blind promotion of tech hype is going to be replaced with actionable insights that can be put to immediate use. Based on some of the feedback we got from the three earlier events this year, it looks like we struck a chord:

“IMN have brought together a one-of-a-kind venue for the HealthIMPACT forum. It offers an opportunity to explore, in-depth, the intersection of emerging models of cloud computing with solving some of our toughest problems in health information technology. It’s a great opportunity to meet national thought leaders and explore these issues at depth in an intimate setting.” – Keith Toussaint, Executive Director, Business Development, Global Business Solutions, MAYO CLINIC

“You had a pretty engaged group yesterday. I would think you regard the meeting as successful; it was in a beautiful venue.” – David S. Mendelson, MD, FACR, Co-Chair Integrating the Healthcare Enterprise, Professor of Radiology, Director of Radiology Information Systems, Pulmonary Radiology, Senior Associate, Clinical Informatics, MOUNT SINAI MEDICAL CENTER

“[The open format] allows for valuable exchange between participants. The forum consists of important topics and fluid discussions going where the audience wants to take it.” – George Conklin, Senior Vice President and CIO, Christus Health

“HealthIMPACT seemed more focused with only high quality contributors and content. HealthIMPACT was collaborative with fewer ‘talking heads’ and more open and honest dialog. I truly felt that it was a more intimate environment for sharing.” – Zachery Jiwa, Innovation Fellow, US Department of Health and Human Services

I’m often asked why, as a health IT blogger, I wanted to lead HealthIMPACT. Here’s a three minute video overview that explains my thinking:

Based on the feedback from the Houston, NYC, and Santa Monica events and what we’ve heard from our surveys, below are some of the topics we plan to cover in Chicago on September 8th at HealthIMPACT Midwest.

  • Reckoning with the Challenges of Meaningful Use Stage 2
  • Fear and loathing, as well as excitement, around new risk-based collaborative payment systems and value-based reimbursement
  • Cutting through the Health IT Hype Cycle – The Top Five Things That Matter When You are Running a Health System
  • Using Mobile Applications to Align Caregiver Behavior to Enterprise Initiatives While Improving Patient Satisfaction and Outcomes
  • Doing More with Less – Clinical and Financial Integration Required to Deliver True Population Based Health Management for a Value-Based Reimbursement Environment
  • Interoperability and Coordination of Care across Multiple Providers – Realizing the Value of Health Information Exchange
  • Working With Tech Providers to Build and Implement Technology That Works for Your Physicians, Nurses, and Patients
  • A Look to the Future of Clinical Decision Support and Analytics
  • Using Advanced Analytics to Improve the Patient Experience for your Community
  • Creating the IT Integration Playbook for Success During Mergers and Expansions
  • What You Can Do to Protect Your Organization as You Become More Dependent on Cloud Based Services
  • Innovation Shark Tank – The Questions You Need to Ask and the Questions Vendors Need to be Ready For

All of the prepared agenda items above will be delivered in a unique and novel way so that the audience can drive the direction of the conversation. At HealthIMPACT we ask our audience to keep us honest, and they do. Some of the other topics that will be woven throughout the day include:

Data integration and system interoperability

  • Information exchange between hospitals and outside groups/providers
  • Mobile interoperability of patient data
  • Interoperability strategies to ensure exchange of quality information
  • HIE Connectivity, Direct Trust Testing/Connectivity
  • Improved communication between providers

Population Health and Patient Engagement

  • How will involvement of patients in their own care change the way healthcare is practiced? Will it really?
  • What efforts are being made to reach out to the average patient in the population so they can access and use the health care system the same way that the average person is able to use the banking or retail system?

Data Governance

  • Ensuring data accuracy
  • Controlling data output to ensure it is of the highest quality and provides consistent outcomes
  • Data governance, measure burden, data analysis
  • Strategies for accurate and reliable data entry
  • Ensuring the quality of information within your EMR
  • Use of computer-assisted clinical documentation or coding to improve clinical outcomes
  • Computer-Assisted Coding (CAC) and Computer-Assisted Physician Documentation (CAPD)
  • Master Data Management
  • Reconciliation of data between systems

Meaningful Use

  • Assuring on-time and on-budget completion of projects (principally MU2), in the face of reduced reimbursement and personnel resources.
  • Implementation of MU 2
  • Meeting MU2 and CMS rules with minimal impact on physician workflow/productivity
  • Transition of Care (TOC) measure and use of C-CDA & Direct messaging
  • Developing solutions that will satisfy conflicting requirements between CMS sections without requiring staff to do duplicative documentation
  • Effective clinical integration ideas for EHR (Epic) implementations
  • Epic implementation
  • Interoperability between legacy systems and modern systems
  • Keeping track of rapid changes in software in the electronic health record
  • Keeping track of changes from CMS
  • Staying current with IT information that comes so fast
  • Meaningful Use Audits
  • Implementing an electronic medical record
  • Successful attestation for MU Stage 2, Phase 1
  • Maintaining metrics in the face of ever changing regulatory requirements
  • Transition of the traditional quality core measures to the electronic clinical quality measures
  • Managing changes in workflows as new components in the EHR are implemented to meet meaningful use requirements

Clinical Informatics

  • Use of analytics/data to coordinate care and cut costs
  • Developing a Health Care Data and Analytics division
  • Knowledge of successful strategies to move the clinical informatics agenda forward
  • Population Health and Data Mining
  • Not seeing nursing informatics (NI) at work in our healthcare facilities
  • Seeing NI professionals as leaders in the field
  • Job availability for NI
  • Ways in which nursing informatics is impacting healthcare
  • The integration of Nursing informatics as a part of IT in healthcare
  • Focus on nursing informatics and their role in healthcare
  • Cost, big data, and interoperability

Clinical Decision Support

  • Enabling more robust clinical decision support
  • Exploring and successfully implementing alternative methods of care delivery

Mobility

  • How to get the most out of mobile platforms.
  • Role of mobile devices in Health IT.
  • Telehealth
  • Clinical solutions and patient engagement solutions
  • How to be successful with cloud strategies

Cost & Resources

  • Ensuring that using IT in care delivery actually helps reduce the cost of healthcare
  • Cutting the cost of contracted services
  • Supporting the education efforts of various departments, without having to assume responsibility for conducting the actual education
  • Prioritizing according to corporate strategic direction
  • Making the workflow of the IT operations area more efficient
  • How to evaluate new technology
  • A global sense of what the most useful cutting-edge technologies are
  • Resources, money, and changes in government regulations
  • Project management, C-suite expectations, and talent acquisition
  • Money to implement, train, maintain. Trained technical people. Affordable bandwidth.
  • Funding; dealing with increasing integration requirements; the need for speed in an increasingly complicated environment
  • Budgets; finding qualified staff to fill positions; GRC culture change to make the business more responsible for its applications
  • Change management in general

Innovations

  • What start-up technologies are larger institutions potentially looking at?
  • What apps should patients be “prescribed”?
  • Trends and direction in new technologies such as wearables

Security

  • Security during system implementation
  • Authentication, electronic signature
  • Medical & Personal Device Security
  • Security and privacy for mobility


To improve patient satisfaction, hospital supply chain units need better IT and next generation technology

I’ve been looking at hospital supply chain automation and the IT surrounding it for a number of years now. Starting with Cardinal Health and then moving on to help a number of other vendors in the space, I’ve felt that there’s not been enough next-generation tech being applied to the low-margin, high-volume business of hospital […]


Guest Article: OLAP remains a great healthcare analytics architecture, even in the Big Data era

I’ve been getting many questions these days about big data tools and solutions, especially their role in healthcare analytics. I think that unless you’re doing large-scale analysis of biomedical data such as genomics, it’s probably best to stick with traditional, tried-and-true analytics tools. Online Analytical Processing (OLAP) can be invaluable for medical […]
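For readers who haven’t worked with OLAP-style analysis, here’s a tiny, purely illustrative Python/pandas sketch (my own, not taken from the guest article) of the kind of dimensional rollup an OLAP cube makes routine; the claims data and column names are made up for the example.

```python
# Illustrative sketch (not from the guest article): an OLAP-style rollup with pandas,
# aggregating a small claims dataset along two dimensions the way a cube would.
import pandas as pd

claims = pd.DataFrame({
    "department": ["cardiology", "cardiology", "oncology", "oncology"],
    "month":      ["2014-07",    "2014-08",    "2014-07",  "2014-08"],
    "paid_usd":   [12500.0,      9800.0,       22100.0,    18700.0],
})

cube = pd.pivot_table(
    claims,
    values="paid_usd",
    index="department",   # one dimension of the cube
    columns="month",      # another dimension
    aggfunc="sum",
    margins=True,         # grand totals, akin to a ROLLUP
)
print(cube)
```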


What EHR/PM vendors should do as 63% of buyers look to replace existing PM solutions

Melissa McCormack, a medical researcher with EHR consultancy group Software Advice, recently published their medical practice management BuyerView research, which found that 63% of the buyers were replacing existing PM solutions, rather than making a first-time purchase.  This mirrors the trend we’ve seen across medical software purchasing, where the HITECH Act may have prompted hasty […]


Guest Article: HL7 FAQ and why exchanging critical patient data isn’t a nightmare

I recently saw a demo of the Decisions.com platform and left impressed with the workflow engine, business rules execution, forms automation, and data integration platform. I’m very familiar with almost all the major HL7 routers and integration engines out there, but Carl Hewitt, Founder and Chief Architect at Decisions, is releasing something fairly unique — a visual HL7 interface definition and […]
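As a bit of context for readers who haven’t looked inside an HL7 v2 feed, here’s a tiny, self-contained Python sketch (my own illustration, not part of the Decisions platform) that parses the pipe-and-caret delimited structure; the sample ADT message and helper function are hypothetical.

```python
# Minimal illustration: HL7 v2 messages are pipe-and-caret delimited, which is why
# visual interface definitions and integration engines are so useful in practice.
SAMPLE_ADT = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|20140801120000||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19700101|M\r"
)

def parse_segments(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [fields]} for quick inspection."""
    segments = {}
    for raw in filter(None, message.split("\r")):   # segments end with carriage returns
        fields = raw.split("|")
        segments[fields[0]] = fields[1:]
    return segments

if __name__ == "__main__":
    parsed = parse_segments(SAMPLE_ADT)
    # PID-5 holds the patient name; name components are separated by "^"
    family, given = parsed["PID"][4].split("^")[:2]
    # Index 7 here corresponds to MSH-9 (message type) because MSH-1 is the field separator itself
    print(f"Message type: {parsed['MSH'][7]}, patient: {given} {family}")
```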


Guest Article: What EHR buyers and health IT vendors can learn from the Nashville market

Zach Watson over at TechnologyAdvice.com wrote a nice piece on EHR Trends in Nashville. I’m not a big fan of “trends” articles because trends themselves aren’t that important; the implications of those trends, and how to operationalize them, are what matter most. I enjoyed Zach’s article, so I asked him to tell us what those trends mean for […]


Guest Article: Is Patient Generated Health Data (PGHD) trustworthy enough to use in health record banks?

The push towards shifting the patient’s role from a passive recipient of care to an active member of the care team looks set to gain further legislative backing. Earlier this year, the Health IT Standards Committee, along with The Joint Commission and ONC, laid out recommendations for integrating patient generated health data (PGHD) into Stage 3 […]
