Posted on 2014-07-07 23:53:49
Part II: Speed
Capitalizing on the Power of Big Data for Healthcare
The increasing digitization of healthcare information is opening new possibilities for providers and payers to enhance the quality of care, improve healthcare outcomes, and reduce costs. Technology advances, regulatory mandates, and government incentives have accelerated the move from paper to digital health records. With information in digital form, healthcare organizations can use available tools and technologies to analyze that information and generate valuable insights.
Bringing together disparate data silos from within an organization can help increase the value of those analyses. The integration of electronic health records (EHRs), medical claims, videos, medical images, scanned documents, and physicians’ notes enables organizations to create a rich, 360-degree view of each patient. Incorporating external sources of data is also key. Integrating social, demographic, environmental, and behavioral information relating to patients allows organizations to discover new correlations that might otherwise have remained hidden.
Creating a more holistic view of each patient and analyzing a wider array of information will help organizations meet the requirements of emerging healthcare models. For example, an increasing number of providers are becoming part of Accountable Care Organizations (ACOs), moving toward fee-for-value payment models, and entering into reimbursement contracts through which they are paid for taking care of a whole person or a whole episode of care. These changes require organizations to better coordinate care among multiple providers so they can improve the efficiency of care.
Healthcare organizations must also analyze internal and external patient information to more accurately measure risk and outcomes.
At the same time, many providers and payers are working to increase data transparency to produce new insights and facilitate research. Some established providers and payers are forming joint ventures, creating integrated delivery networks, and using Health Information Exchanges (HIEs) to share information. Some large pharmaceutical companies are de-identifying data from their clinical trials, protecting patients’ privacy while making the data available to qualified researchers outside the organization. Legislation in some states has spurred the creation of All-Payer Claims Databases (APCDs), which require all payers in a state to submit claims data to a central repository that can be used to better understand costs, quality, and outcomes.
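The de-identification step mentioned above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual method: it replaces a direct identifier with a keyed hash (a pseudonym), so the data custodian can still link a patient's records while outside researchers cannot recover the identity. The key, patient ID, and field names are all hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret key held only by the data custodian.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonym).

    Using HMAC rather than a plain hash prevents dictionary attacks by
    anyone who does not hold the key, while still letting the custodian
    link records for the same patient across tables.
    """
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

# Illustrative record, not real trial data.
record = {"patient_id": "MRN-00123", "age": 57, "outcome": "responder"}
shared = {**record, "patient_id": pseudonymize(record["patient_id"])}
# The shared record keeps analytic value (age, outcome) but no direct identifier.
```

Note that hashing direct identifiers is only one part of de-identification; quasi-identifiers such as dates and ZIP codes also need handling under regimes like HIPAA Safe Harbor.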
As all these changes suggest, the era of big data has arrived in healthcare. The volume, variety, and velocity of healthcare data are increasing; organizations are collecting more data from a wider variety of sources at greater speed every day. Data sources range from the traditional (including EHRs, medical images, and real-time data from monitoring devices) to the nontraditional (such as patient or plan member feedback from social media). To maximize the value of all this data, organizations must adopt new approaches and deploy solutions that can help deliver meaningful insights at the moment when they can have the greatest impact.

Generating New Insights from Big Data

The new health information landscape gives organizations unprecedented opportunities, if they can successfully apply analytics to big data. In addition to improving health outcomes and reducing costs, payers and providers can use new insights to better market products and enhance the experience of patients and plan members. New insights can also help organizations more effectively communicate with healthcare consumers and encourage healthier lifestyles.
To realize these benefits, however, new approaches and technologies are required. Organizations need new analytics solutions and robust infrastructures that can handle the volume, variety, and velocity of big data and generate results rapidly.
Volume

Healthcare organizations are collecting more data, and they intend to analyze it more comprehensively than before. Instead of extrapolating insights from a small sample of claims data, a payer might want to examine all two million rows of data available. Analyzing larger collections of data improves the accuracy of results and can help users find unexpected patterns and insights.
Processing large data volumes requires hardware that can deliver outstanding performance. Fortunately, new generations of industry-standard multi-core processors can provide the performance required—often at a much lower cost than yesterday’s big, expensive proprietary systems.
Using industry-standard servers also helps organizations achieve cost-effective scalability. In the past, organizations scaled up to accommodate growing data volumes by ripping out and replacing one server with a single bigger one, an approach that resulted in higher capital and operating costs. Today’s industry-standard servers allow scaling horizontally, so organizations can add capacity using smaller, open platforms that are less costly to purchase and maintain.
Adopting the right networking and storage solutions is also essential for managing large data volumes and delivering rapid results cost-effectively. Organizations need fast, high-throughput connectivity solutions to reduce data bottlenecks, plus storage solutions that can balance performance, capacity, and cost.

Of course, to run queries, conduct complex analyses, and rapidly generate new insights, organizations need high-performance analytics software. That software must be flexible and scalable enough to support ongoing analytic endeavors, no matter how big the data or how complex the analysis needs become. Software that enables distributed processing options, including in-memory, in-database, and grid computing models, can help health organizations take advantage of the latest technology advances while providing the scalability for growth and the flexibility for change.
Variety
To capitalize on the wide variety of data available, organizations need software solutions that can help them capture, integrate, and analyze unstructured data, such as the clinicians’ notes buried in electronic health records. Master data management solutions can help meet an organization’s requirements for integrating data from multiple sources and help ensure the data is reliable.
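As a concrete (and deliberately simplified) illustration of capturing structure from clinicians' notes, the sketch below pulls a few vital-sign fields out of free text with regular expressions. The note text and field patterns are invented for illustration; a production pipeline would rely on clinical NLP tooling rather than hand-written regexes.

```python
import re

# Illustrative free-text clinician note (not from any real record).
NOTE = "Pt reports chest pain x2 days. BP 142/91, HR 88. A1c 7.9. Denies smoking."

# Hand-written patterns for a few structured fields (assumption: this
# note style; real notes vary widely).
PATTERNS = {
    "blood_pressure": r"BP\s+(\d{2,3}/\d{2,3})",
    "heart_rate": r"HR\s+(\d{2,3})",
    "a1c": r"A1c\s+(\d+(?:\.\d+)?)",
}

def extract_fields(note: str) -> dict:
    """Return whichever structured fields can be found in the note."""
    out = {}
    for field, pattern in PATTERNS.items():
        match = re.search(pattern, note, flags=re.IGNORECASE)
        if match:
            out[field] = match.group(1)
    return out

print(extract_fields(NOTE))
# {'blood_pressure': '142/91', 'heart_rate': '88', 'a1c': '7.9'}
```

Once notes are reduced to fields like these, they can flow into the same master data management and integration processes as structured EHR data.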
Healthcare organizations should look for comprehensive data management solutions that enable them to:
• Access critical data independent of systems and platforms
• Produce accurate, correct, and consistent information from all sources
• Manage data governance initiatives to conform with compliance and business policies
• Integrate data in a graphical environment, orchestrate processing, and enable users and organizations to collaborate with other entities
• Centrally manage data from a single, easy-to-use graphical interface console

Velocity

Analytics software, and the infrastructure on which it runs, must also be able to capture and analyze high-velocity data and deliver timely results. Patient monitoring systems, such as those used in ICUs, generate critical data at a rapid pace. If organizations can produce insights from that data in real time or near-real time, they can provide those insights to patient care teams at the moments when interventions will have the greatest benefit.
Meeting these needs requires moving from a batch processing model to near-real-time processing and reporting. Traditional, batch-oriented approaches are designed to process data on a nightly or weekly basis. To enable faster decision making, organizations need software that can capture data as it comes in and analyze it in close to real time. If healthcare providers can quickly explore high-velocity data, they can promptly identify potential problems and take immediate steps to address them.

To help users quickly make sense of high-velocity data, organizations need analytics solutions that incorporate visualization capabilities. Visualization can help users identify patterns, make correlations among disparate data types, and explore data much more quickly and easily than when viewing a spreadsheet. Visual analytics tools that use familiar drag-and-drop functionality and enable access through mobile devices help bring analytics to a wide range of business users. Analytics no longer has to be relegated to back-office specialists.
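The stream-processing pattern described above can be sketched in a few lines: keep a sliding window of recent readings and flag the moment the window average crosses a threshold. The window size, alert threshold, and sample heart-rate stream are illustrative assumptions; a production system would consume from a device gateway or message bus rather than a list.

```python
from collections import deque

WINDOW_SIZE = 5        # readings per sliding window (illustrative)
ALERT_THRESHOLD = 120  # beats per minute (illustrative, not clinical guidance)

def monitor(readings):
    """Yield (reading, alert) pairs as each reading arrives.

    An alert fires when the average over the last WINDOW_SIZE readings
    exceeds ALERT_THRESHOLD, smoothing out single-reading noise.
    """
    window = deque(maxlen=WINDOW_SIZE)
    for hr in readings:
        window.append(hr)
        avg = sum(window) / len(window)
        yield hr, avg > ALERT_THRESHOLD

# Simulated high-velocity stream of heart-rate readings.
stream = [92, 95, 110, 128, 135, 141, 150, 133]
alerts = [hr for hr, alert in monitor(stream) if alert]
# The first readings pass quietly; alerts fire once the rolling average climbs.
```

Because each reading is evaluated as it arrives, the same logic works unchanged whether the input is a test list or a live feed, which is the essential difference from nightly batch jobs.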
Getting Started
Analyzing big data holds tremendous promise for healthcare providers, payers, and patients. But how should a healthcare organization get started with big data?
1. Work with business units to articulate opportunities: Capitalizing on big data opportunities requires an end-to-end strategy in which IT groups are the technical enablers but key executives, business groups, and other stakeholders help set objectives, identify critical success factors, and make relevant decisions. Together, these groups should consider existing problems that have been difficult to address, as well as problems that have never been addressed before because data sources are new or unstructured.
2. Get up to speed on technology: IT groups must solicit information from peers and vendors to identify the best software and hardware solutions for analyzing big data in a healthcare context.

3. Develop use cases: Defining and developing use cases will help organizations focus on the right solutions and create the best strategies. As part of this process, IT groups should map out data flows, decide what data to include and what to leave out, determine how different pieces of information relate to one another, identify the business rules that apply to data, consider which use cases require real-time results and which do not, and define the analytical queries and algorithms required to generate the desired outputs.