Top Ten Tech Trends 2018: Could Newfangled Business and Technology Combinations Upend the Landscape of Healthcare Delivery? – Healthcare Informatics

Editor’s Note: Throughout the next week, in our annual Top Ten Tech Trends package, we will share with you, our readers, stories on how we gauge the U.S. healthcare system’s forward evolution into the future.

In an article published online in Business Insider in July, Shelagh Dolan put it very directly: “Healthcare disruption is no longer looming—it’s here, and it’s necessary. As the population continues aging, health organizations and providers are struggling to keep up with growing demand for care, while consumers are facing astronomical costs—often for subpar service.”

What’s more, Dolan wrote, “Desperate for ways to cut costs and improve accessibility of patient care, traditional healthcare providers are turning to tech innovations for help. In the last five years alone, healthcare funding among the 10 largest US tech companies jumped from $277 million to $2.7 billion. New technologies including telehealth, wearables, mobile apps, and AI are facilitating a shift towards preventative medicine and value-based care, in turn reducing claims, improving benefits plans, lowering patients’ premiums, and increasing their lifetime value.”

What are these companies up to? Alphabet, Google’s parent company, is leveraging its extensive cloud platform and data analytics capabilities to home in on trends in population health, the Business Insider report noted. The company plans to drive more strategic health system partnerships by identifying issues with electronic health record (EHR) interoperability and with currently limited computing infrastructure. Meanwhile, Amazon’s recent acquisition of PillPack is another step towards integrating medical supplies and pharmaceuticals into its platform and distribution. The company is also ramping up its AI capabilities to transform Alexa into an in-home health concierge, thereby driving consumers to the website for prescriptions and basic medical supplies. Apple and Microsoft are moving in, too, with Apple looking to turn its popular consumer products into powerful healthcare tools, with monitoring capabilities that benefit both patients and providers, and Microsoft, like Alphabet, leveraging its robust data and analytics capabilities for visibility into population health.

As a June 15 report in the online publication EHR Intelligence noted, the research firm Kalorama Information reports that those three companies—Google, Apple, and Microsoft—filed more than 300 healthcare patents between 2013 and 2017. Among them: Google’s 186 patents, mainly focused on DeepMind, its artificial intelligence arm, and Verily, its healthcare and disease research entity; Apple’s 54 patents, focused on turning the iPhone into a medical device that can monitor biometric data such as blood pressure and body fat levels and on developing algorithms to predict abnormal heart rates; and Microsoft’s 73 patents, centered on expanding its artificial intelligence capabilities and developing monitoring devices for chronic diseases.

That EHR Intelligence article also stated this: “Although Amazon has not officially announced details, industry rumblings indicate that Amazon has been working with a secret project team that is exploring platforms for EMR data, health apps and telemedicine. This secret team is known as 1492 and is working on extracting data from EMRs to make it more useful to healthcare providers. Reportedly, the team is also working on building a telemedicine arm, using technology to make virtual doctor/patient consultations a reality.”

Will New Business Combinations Upend the Healthcare Provider Landscape?

Meanwhile, new business combinations are hitting healthcare, too. The announcement on Dec. 3, 2017, that mega-pharmacy retailer CVS was planning to acquire health insurer Aetna for $69 billion threatened to reshape the landscape of healthcare delivery and organization in the U.S. And just seven weeks later, on Jan. 30, three corporate giants—Amazon, Berkshire Hathaway, and JPMorgan Chase & Co.—announced, in an ambitious-sounding if vaguely worded statement, that they were launching an initiative to improve satisfaction and reduce healthcare costs for their companies’ employees.

And all of these moves to disrupt healthcare are connected, fundamentally, to the high and growing cost of the U.S. healthcare system, which Medicare actuaries expect to reach $5.6 trillion, and 19.9 percent of gross domestic product, by 2025. As Joseph Walker noted in a July 31 article in The Wall Street Journal titled “Why Americans Spend So Much on Health Care—In 12 Charts,” “Americans aren’t buying more healthcare overall than other countries. But what they are buying is increasingly expensive. Among the reasons is the troubling fact that few people in health care, from consumers to doctors to hospitals to insurers, know the true cost of what they are buying and selling. Providers, manufacturers and middlemen operate in an opaque market that can mask their role and their cut of the revenue. Mergers give some players more heft to enlarge their piece of the pie. Consumers, meanwhile, buoyed by insurance and tax breaks, have little idea how much they are spending and little incentive to know underlying costs.”

So what will the impact be of all of these bold technological and business moves, for established patient care organizations in U.S. healthcare?

With regard to the planned CVS-Aetna merger, Tim Zoph, client executive and strategist at the Naperville, Ill.-based Impact Advisors, Inc., and a former healthcare CIO, says, “This is a vertical play. Part of this is just the need to really manage the pharmaceutical supply chain and costs. Medications and drugs still represent the largest and fastest-growing costs in healthcare. So the idea that you can insure and also be close to the management and distribution of those medications, sort of brings the insurance risk of managing the PBM (pharmacy benefit manager) side, to the fore. And Aetna is really seeing an opportunity for CVS to reinvigorate beyond its current business model. There may be an opportunity beyond pharmacy management, to look at other retail services, in retail settings. Care is really the growth market; it’s lower-cost, distributed, and convenient. The idea that you have a retail venue that starts with pharmacy is a strong vertical play.”

Meanwhile, Steve Valentine, vice president of strategic advisory services at the Charlotte-based Premier Inc., says, “Today [Aug. 3], the California Attorney General opposed the CVS-Aetna merger over cost concerns. We would call that a vertical merger. They have the Minute Clinics, the PBM, etc.—we see them invading the healthcare space to compete for what we call ‘stickiness with the consumer.’ You have 9 to 11 percent of the spend in pharmaceuticals. It will be interesting to see whether they go down a path like United, which has Optum, and which is acquiring medical groups.”

And, says Valentine’s colleague Shawn Griffin, M.D., vice president, clinical performance and applied analytics, at Premier, “You’re seeing continued attempts to find ways to put together groups that can help you save on healthcare spend. You’re seeing that with innovations, and with partnerships of all kinds, and they’re all trying to find out what the right team is, to become more efficient and improve quality. You’ve seen it with the EMR vendors, too.”

Don Crane, president and CEO of the Los Angeles-based America’s Physician Groups (APG), a nationwide association of physician groups involved in risk-based contracting, says of all the new combinations, both the technological and business ones, “To me, they signal a very restive employer world, a restive and dissatisfied employer world, certainly, when you talk about Google and Amazon, and so, too, with the carrier-PBM combination. There, it’s more about the players looking for a new model, implying that there’s a dissatisfaction to the point of abandonment of faith in the existing model. So, has the inefficiency of our current healthcare delivery system now produced pain at such a high level that it’s no longer about academic conversations, but time for a variety of different actions? That’s what it’s telling me, that we’re about to hit a pain point. Healthcare is using up more and more of our GDP, and really is hitting our global competitiveness now. So yes, this is very significant.”

Of course, the fundamental practical question is, should the leaders of hospitals, physician groups, and health systems worry about all these new disruptive developments? “Yes, I think so,” says Impact Advisors’ Zoph. “There are two trends to think about here,” he says, speaking of the pending CVS-Aetna merger. “One is the distributed-care, remote to retail, focus on the consumer, follow the higher margin, business; and the other is the issue of consolidation among providers. We’re seeing a growing consolidation of physicians, and of hospitals. That deal gives them greater control and greater market share; evidence shows that… there will be tension because of the consolidation and shoring up of the traditional healthcare market at the same time as these new entries.”

Meanwhile, APG’s Crane notes that “I don’t think that the architects of these various transactions see them all in the same way. They have slightly different strategies, and are facing different challenges.” For example, he notes, referring to the CVS-Aetna deal, “The minute-clinic concept never really took off; it didn’t exactly work. But what’s different about these diagonal mergers? I think some of it lies in the data—you’ll be combining the data of a health plan with a pharmacy with a PBM. And we’re moving into an era of artificial intelligence and machine learning and the ability to stack up algorithms to the nth degree and know things we didn’t know before.”

What’s more, Crane says, the leaders of patient care organizations need to begin to rethink the organization and delivery of primary care, as new models come into being. “There’s also the factor of the idea of the transformation of primary care,” he says, speaking of the business and technology disruptors. “I think they envision a world where you don’t have to call your doctor six weeks in advance, drive through traffic, wait for hours, wait for days to get your results—and that just doesn’t seem cool in the second decade of the 21st century. It’s a model begging for revolution.”

Top Ten Tech Trends 2018: Could A New Interoperability Wrinkle Solve Healthcare’s Biggest Challenge? – Healthcare Informatics

There has been much debate on the possibility of mandating providers to participate in health information exchange activities to stay in Medicare

For a few years now—from the latter part of the Obama administration through the first 18 months of Trump’s presidency—federal health officials have been adamant about moving from an era of EHR (electronic health record) adoption to one in which these technology systems will make it easier for providers and patients to share health data across the care continuum.

Of course, healthcare interoperability has been a great pain point to date, with one of the primary barriers being the lack of a true business incentive to compel providers and EHR developers to be “open” with this ever-so-important data. To this end, in recent proposed regulations, federal health leaders have clamped down, perhaps harder than ever before, in their ongoing effort to guide stakeholders to a world in which seamless health data exchange is the norm rather than a rarity.

There may have been no greater evidence of this than the Centers for Medicare & Medicaid Services’ (CMS) proposed updates to Medicare payment policies and rates under the Inpatient Prospective Payment System (IPPS) and the Long-Term Care Hospital (LTCH) Prospective Payment System (PPS) in April. In the rule, CMS proposed to re-name the federal EHR Incentive Program, or meaningful use, choosing to now call the program “Promoting Interoperability.” But just how far the federal agency will go beyond “promotion” is what’s particularly fascinating.

CMS wrote that it will be seeking public comment, via an RFI (request for information) on whether participation in the government’s Trusted Exchange Framework and Common Agreement (TEFCA) initiative—a federally-constructed plan released in January to jolt interoperability among providers—could be used as a vehicle to mandate providers to share data.

Indeed, more specifically, in the proposed regulation, CMS said it would be soliciting feedback on whether the agency should revise its “Conditions of Participation” for hospitals to require them to perform health information exchange activities such as electronically transferring medically necessary information upon a patient’s discharge, sending discharge information to a community provider via electronic means, and making information available to patients or third-party apps. If those Conditions of Participation were revised and providers failed to meet them, the consequence would be that they could not participate in Medicare.

How Are Stakeholders Reacting?

As one might imagine, seeking comment on whether or not interoperability should be a requirement for Medicare participation has elicited a wide array of responses across the sector. Comments on the RFI were due to CMS by the end of June, and the agency offered no update on next steps when it published the final Promoting Interoperability rule in early August. But as one health IT expert, Jeff Smith, vice president of public policy at the Bethesda, Md.-based AMIA (the American Medical Informatics Association), points out, it could be some time before there are actual interoperability mandates.

“You have to keep in mind the realities of the process. What you need to remember is that the Conditions of Participation aspect of the [proposed rule] was specifically called out as an RFI. And the RFI tries to get information that the agency could consider for potential future rulemakings,” explains Smith.

In other words, CMS did not include the potential Conditions of Participation revision as part of the proposal; rather, the RFI is the step leading up to a potential proposal. And “potential” is the key word to keep in mind, Smith says.

Nevertheless, health IT trade groups were keen to give their feedback on the RFI in their public comments. While some stakeholders were adamantly against the possibility of revising the Conditions of Participation to advance interoperability, others were strongly in favor.

Kelly Hoover Thompson, CEO of SHIEC (the Strategic Health Information Exchange Collaborative), a national collaborative that represents health information exchanges (HIEs) and their partners, believes that imposing regulatory requirements is not the best solution to the interoperability problem. “I don’t think it has to go as far as what CMS [could be proposing], but I do think that it’s an indication to the industry, on a good level, that so many people are [thinking about] how to make things better,” says Hoover Thompson. Other major industry trade groups, such as AMIA, the American Hospital Association (AHA) and the College of Healthcare Information Management Executives (CHIME), similarly attested in their comments that CMS is taking the wrong approach.

That said, others feel differently. A letter signed by more than 50 organizations, representing plans, providers, patient groups, ACOs (accountable care organizations) and health IT companies, called on CMS to take more aggressive action to promote interoperability and advance health information exchange. Signatories include prominent industry names such as Beth Israel Deaconess Care Organization, Blue Shield of California, the New York eHealth Collaborative (NYeC), and Aledade.

Valerie Grey, who leads one of the letter’s signing organizations, the New York eHealth Collaborative, says that NYeC supports potentially changing the Conditions of Participation as an incremental step. Grey says that more vendors and providers have been willing to make data available for shared patients, but that further progress is necessary.

More broadly, though, Grey points out that not every hospital can afford the EHRs that are enabling some of that interoperability today. However, given that NYeC’s role is to promote and enable information exchange, it is natural for the organization to support efforts that seek to promote information sharing, she says.

Will Business Incentives Change?

In the end, just as the previous administration changed incentives to encourage EHR adoption, the current one will have to find ways to inspire the next stage of interoperability: the efficient movement of health data. But it’s a delicate balance and it becomes a question of where to apply the pressure and how to incentivize the kind of behavior that is desired, AMIA’s Smith says.

Another piece of the puzzle is an information blocking rule from the Office of the National Coordinator for Health IT (ONC), which was originally expected in April, but now has been delayed until September [and has not been published at the time of this story]. It’s this rule, coupled with TEFCA, that experts believe could move the needle on incentivizing providers and EHR developers to interoperate.

“The hope was that the information blocking rule would be the definitive answer to the business incentive piece to the interoperability puzzle,” says Smith, noting that there are plenty of questions that need to be answered even within that rule itself. For instance, he contends, “data availability—when it comes to matters of life and death are issues of public health. But data availability for non-matters of life or death, or matters of an individual, comes down to more business questions—such as do I need to share the data for everyone who asks for it?”

Regarding TEFCA, officials are likely to put out a second draft version, and another public comment period will follow, says John Kansky, the CEO of the Indiana Health Information Exchange and an appointed member of the Health Information Technology Advisory Committee (HITAC), which has had a lot of influence in making TEFCA recommendations to ONC. “It’s fun to think about and I feel privileged in having a front row seat in helping advise ONC, but we don’t know how the regulation will evolve,” Kansky admits.

That said, it is neither helpful nor informative that opinions on TEFCA are all over the map, he adds. There are two buckets of folks thinking about and opining on this regulation, Kansky says: those who strongly advocate for consumers having access to, and control of, their own data; and those who have been working to make industry interoperability a reality, whether through EHR vendors or HIEs. “Those two groups tend to have different views on regulation,” he points out.

Going forward, Smith asserts that the key will be how these policies that the industry has been expecting for some time now will intertwine with one another. “What I am hoping beyond hope is that the reason these policies are not yet publicly available is because the powers that be are still trying to make sure that there is harmonization and a logical inter-reliance across them,” he says.

Meanwhile, Kansky believes that the interoperability space is “ripe for bold change and opportunity.” A few years into the future, he predicts, among the EHR interoperability approaches that will be prevalent, along with HIEs and regulations like TEFCA, the ecosystem will find a way for everything to work together. “Even when one sees the other as competition or making the way of life more challenging, this is what we do in the free market in the U.S.,” he says. “We throw a bunch of different approaches at a problem and sometimes we make a mess, but in the end we figure out a solution. And the solution isn’t one thing, pure and simple; oftentimes it’s a complicated ecosystem—but it works.”

Dell Technologies Announces New IoT Solutions to Automate Powerful, Actionable Insights – PR Newswire (press release)

LAS VEGAS, Aug. 28, 2018 /PRNewswire/ — VMWORLD

News summary

  • New solutions include hardware and software engineered to work together to support computer vision and machine intelligence from edge to cloud
  • New Dell Technologies IoT Solution for Surveillance speeds time to realize return on investment with flexibility and customization
  • New IoT Connected Bundles, created through the Dell Technologies IoT Solutions Partner Program, offer channel partners new revenue streams

Full story

Dell Technologies’ Edge and IoT Solutions Division is announcing new solutions and bundles to simplify deployment of secure, scalable solutions for edge computing and IoT use cases. With these solutions, Dell Technologies is combining tools from its broad portfolio with technology from Intel and partners in the Dell Technologies IoT Solutions Partner Program. This will drive workloads for computer vision – enabled by imaging sensors – and machine intelligence – characterized by structured telemetry from sensors and control systems. Dell Technologies has collaborated with Intel, which has helped advance these solutions with its computer vision and analytics technologies.

“Workloads and use cases for computer vision and machine intelligence require different combinations of tools, but the computing infrastructure elements are the same,” said Joyce Mullen, president, Global Channel, OEM and IOT Solutions at Dell Technologies. “Dell Technologies provides a scalable, secure, manageable and open infrastructure – spanning edge to cloud – so customers and partners can realize value today and build a foundation to support these workloads and case studies in the future.”

Cameras provide rich information about the physical world, but the resulting deluge of video is too much data for humans to monitor cost-effectively for real-time decision making. Applying analytics, such as artificial intelligence, to these data streams automates powerful, actionable insights. Events driven by computer vision and analyzed together with telemetry from machines – including data that imaging sensors cannot provide, such as voltage, current and pressure – result in even more powerful insights.

Because of rapid evolution in the IoT market and the fast pace of innovation required to stay competitive, many customers now require an ‘IT grade’ infrastructure. This should include a security-first perspective to support scalability, data diversity and the increasingly complex needs of connected devices.

Integrating Dell Technologies solutions to enable computer vision

By enabling computer vision with Dell Technologies IoT solutions, customers can more accurately, efficiently and effectively “see” relevant information pertaining to areas such as public safety, customer experience, and product inventory and quality. Surveillance is the first use case to which Dell Technologies has applied computer vision, so customers can more cost-effectively monitor events in the physical world and automate decision‐making.

The IoT Solution for Surveillance is specifically built to transform and simplify how surveillance technology is delivered with an easy to deploy and manage hyper-converged, software-defined solution. Available later this year to purchase as a package, the engineered, pre-integrated solution will provide a consistent foundation from edge to distributed core to cloud. It will also be ready to run on day one with customer data to speed the return on investment. The solution is currently available as a reference architecture to align systems and build a framework for computer vision learning and adoption for other use cases.

Built on the world’s leading cloud infrastructure, the Dell Technologies IoT Solution for Surveillance will offer:

  • Rapid return on investment – The open architecture allows for customization with immense choice of technology, purchasing and configuration. Additionally, it is flexible to adapt to changing technologies or industry regulations to meet data needs today and in the future.
  • Reduced risk – Dell EMC Surveillance Validation Lab tests hardware and ISV software solutions together in extreme simulations to ensure they will stand up in the real world.
  • Integrated security – The built-in security measures include micro-segmentation (NSX-T) and the ability to push over-the-air (OTA) updates and security patches in real time to all surveillance devices from camera to cloud. It also will include holistic management capabilities across IT and operational technology (OT) concerns through a combination of VMware Pulse IoT Center and Software Defined Data Center.
  • Greater reliability with scalability from hundreds to thousands of cameras/sensors – The solution offers an automated, fault-tolerant approach to scaling from 300TB to 50PB+ on Dell EMC Elastic Cloud Storage for private, on-premise, off-premise or hybrid needs. High availability and zero data loss will be guaranteed with vSAN RAID 5/6 across flash and disk. Additionally, ESXi Enterprise Plus enables high availability and disaster recovery services.

Dell Technologies IoT Solutions Partner Program creates new revenue opportunities for channel partners

The Dell Technologies IoT Solutions Partner Program is an award-winning, multi-tiered program supporting technology and services partners from the edge through the full Dell Technologies solution stack. Through this program, Dell Technologies has identified several partners demonstrating strong use case focus and clear return on investment to create the new Dell Technologies IoT Connected Bundles. The bundles include sensors and licensed software from partners tailored for specific customer use cases, together with various combinations of Dell Technologies infrastructure spanning edge gateway, embedded PC and server hardware. This is in addition to complementary software like VMware Pulse IoT Center for securing, managing and monitoring these solutions at scale.

Sold fully through the channel, IoT Connected Bundles are validated, market-proven solutions that channel partners can deliver to their customers as turnkey offerings. With these solution bundles, channel partners have a new value proposition to offer their customers, as well as a new potential revenue stream. With a clear, repeatable and accelerated path to return on investment, channel partners can enter the IoT market knowing the solution will work and deliver predictable results fast.

The IoT Connected Bundles include:

  • ELM: compliance-aaS for HVAC, refrigeration and power systems
  • V5 Systems: self-contained and powered surveillance for safety and security in outdoor spaces
  • IMS Evolve: energy savings for grocery retailers while improving food quality and safety
  • Modius: advanced Data Center Infrastructure Management (DCIM)
  • Pelco: video surveillance tailored for the requirements of K-12 education
  • Pixel Velocity: efficient remote monitoring of field assets in oil and gas operations
  • ActionPoint: predictive maintenance in midmarket manufacturing
  • Software AG: digital manufacturing intelligence suite for larger-scale operations

Over time, additional bundles will be available through further curation in the Dell Technologies IoT Solutions Partner Program.

Enabling machine intelligence in an open and scalable way

Dell Technologies continues its commitment to openness and standardization in IoT. It actively participates in EdgeX Foundry, a vendor-neutral open source project focusing on building a common interoperability framework to facilitate an ecosystem for edge computing. The project, which has grown to more than 60 member organizations, recently announced the ‘California‘ code release, a major step in evolving the EdgeX framework to support developer requirements for deployment in business-critical IoT applications. Dell Technologies Capital recently invested in IOTech Systems, a startup offering a commercially packaged version of this framework that enables customers to focus on their preferred value-add instead of supporting the open source code.

Dell Technologies also participates in the Industrial Internet Consortium (IIC), the OpenFog Consortium and the Automotive Edge Computing Consortium (AECC).

Availability

  • The reference architecture for Dell Technologies IoT Solution for Surveillance is available today, and the engineered solution will be available in October 2018.
  • IoT Connected Bundles for channel partners will be available in September 2018.

Customer quotes


Dr. Lou Marciani, Director, National Center for Spectator Sports Safety and Security (NCS4)


“Dell Technologies has built a trusted practice for IoT and Surveillance to support our forward-thinking 2025 initiative, to bring better safety to millions of people a year who attend sporting events at large venues. The new IoT Solution for Surveillance is designed specifically to reduce the complexity of building, scaling and managing these complex environments, and we’re excited to see how much this will help make our venues safer.”

Jeff Burgess, President/CEO, BCDVideo


“We’re excited to be one of the first validated partners to leverage Dell Technologies’ new IoT Solution for Surveillance. BCDVideo has always been the ‘easy button’ for our security integrator customers. We continue to take the complexity out of buying, building and managing complex surveillance solutions from the edge to the cloud. Partnering with Dell Technologies on this newest initiative allows us to jointly deliver an enterprise-class, end-to-end platform that is lab tested and validated to ensure stronger security, networking and IT/OT management across the variety of software and hardware layers in the solution. BCDVideo and Dell Technologies share a vision of constant innovation, all the while protecting both the public and the infrastructure.”

Analyst quote


Carrie MacGillivray, Group Vice President, IoT and Mobility, IDC


“Organizations are looking to integrated IoT solutions that bring together the storage, security, network and management and orchestration. Companies need to find a partner that understands these requirements and can help provide the piece parts to build out a holistic solution. Dell Technologies’ holistic portfolio of key IoT solutions and go-to-market options make them a solid partner for your IoT journey.”

Partner quotes

Jonathan Ballon, vice president, Internet of Things Group, Intel


“Imaging and video use cases create extraordinary amounts of data and need an intelligent vision solution to rapidly analyze data near the edge, respond in near real time, and move relevant insights to the cloud. The new Dell Technologies IoT Solution, powered by Intel® Vision Products including Intel® Xeon® Scalable processors and the OpenVINO™ toolkit, provides the performance, efficiency and open compute platform necessary to speed deployment of computer vision solutions to transform vision data into valuable business insights from edge to cloud.”

Craig Smith, vice president, IoT & Analytics Europe, Tech Data

“Tech Data is proud of the IoT solutions we are making available to our customers around the world. These offerings are further testimony of the increasing importance of the solution aggregator role we play in IoT & Analytics. Bringing together the required components from different vendors in the form of solutions that are proven, validated and deployed at scale provides production-ready solutions for our customers and significantly speeds up the time to revenue.”

About Dell Technologies

Dell Technologies is a unique family of businesses that provides the essential infrastructure for organizations to build their digital future, transform IT and protect their most important asset, information. The company services customers of all sizes across 180 countries – ranging from 99 percent of the Fortune 500 to individual consumers – with the industry’s most comprehensive and innovative portfolio from the edge to the core to the cloud.

Copyright © 2018. All Rights Reserved. Dell, Dell EMC, Dell Technologies and the Dell Technologies logo are trademarks of Dell Inc. in the United States and/or other jurisdictions. All other marks and names mentioned herein may be trademarks of their respective companies.

1. IDC WW Quarterly Cloud IT Infrastructure Tracker, Q1 2018, June 2018. Based on vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT.

SOURCE Dell Technologies

Is There Room for Blockchain in Health Care? – The National Law Review

In the tech world, blockchain technology is often presented as a panacea for all problems. As blockchain becomes increasingly popular, many industries are trying to determine the best way to use the technology, and healthcare is no different in this quest. Health care is an optimal candidate to benefit from innovative, transformational technology that could help solve its pressing issues. Blockchain could be the technology that helps alleviate some of health care’s problems, such as the incredibly fragmented delivery of care and the painfully slow reaction to technological advances.

What is Blockchain Technology?

An over-simplified explanation of blockchain is a shared online database maintained by a network of computers. A piece of information, also known as “a record,” is stored in a block. For example, a record of you paying Mr. Smith 10 dollars is stored in a block. Traditionally, that information would be saved in a single database at a data center. With blockchain technology, the record is instead grouped into a time-stamped block, and any later change to that information is recorded in a new time-stamped block rather than by editing the original. Each block is linked to the one before it, creating the blockchain, and every computer in the network holds a copy of the chain. Thus, information cannot be edited or changed without verification from the parties who participate in the blockchain. Blockchain technology distributes and decentralizes information: there is no central company or single person that holds it, which makes it extremely difficult for any one actor to take down or corrupt the network. Blockchain is best known as the public transaction ledger for bitcoin, where it mitigates the issue of double spending (spending the same single digital coin more than once) without the need for one trusted authorizer or central server.
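
To make the chain-of-time-stamped-blocks idea concrete, here is a minimal sketch in Python (an illustration of the general concept, not any production blockchain): each block stores a record, a timestamp, and the hash of the previous block, so tampering with an earlier record breaks every link that follows.

```python
import hashlib
import json
import time

class Block:
    """A single time-stamped record, linked to the previous block by its hash."""
    def __init__(self, record, previous_hash):
        self.record = record              # e.g., "You paid Mr. Smith 10 dollars"
        self.timestamp = time.time()      # when the block was created
        self.previous_hash = previous_hash
        self.hash = self.compute_hash()

    def compute_hash(self):
        payload = json.dumps({"record": self.record,
                              "timestamp": self.timestamp,
                              "previous_hash": self.previous_hash},
                             sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

class Blockchain:
    """An append-only chain; every participant keeps a full copy."""
    def __init__(self):
        self.blocks = [Block("genesis", previous_hash="0")]

    def add_record(self, record):
        self.blocks.append(Block(record, self.blocks[-1].hash))

    def is_valid(self):
        # Tampering with any earlier record breaks the hash links.
        return all(curr.previous_hash == prev.hash and curr.hash == curr.compute_hash()
                   for prev, curr in zip(self.blocks, self.blocks[1:]))

ledger = Blockchain()
ledger.add_record("You paid Mr. Smith 10 dollars")
print(ledger.is_valid())                                   # True
ledger.blocks[1].record = "You paid Mr. Smith 1,000 dollars"
print(ledger.is_valid())                                   # False: the change is detectable
```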

Blockchain and Health Care

Blockchain technology could play a role in the industry’s goal to improve the quality of care through care coordination. Care coordination often involves the sharing of information among multiple providers. Blockchain technology could be used to facilitate this process more efficiently by storing a variety of information, including provider and patient details, within electronic health records (EHRs) on a network of computers. Blockchain would distribute the information across that network, such that data entered into an EHR could be stored on computers belonging to the providers and the patient. Providers and the patient would each hold blocks of information, allowing updates to a patient’s record to be validated with the consensus of all the providers and the patient. Using blockchain in this fashion would give patients control over their care while also encouraging care coordination, because providers would have to interact with one another to update a patient’s file. In this sense, blockchain could take the first step in facilitating the improvement of patient care as a whole.
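
The consensus step described above can be sketched simply; the names, fields, and approval flow below are hypothetical, intended only to illustrate how an update to a shared record might require sign-off from the patient and every treating provider before it is appended to the chain.

```python
from dataclasses import dataclass, field

@dataclass
class SharedRecordUpdate:
    """A proposed change to a shared patient record (hypothetical example)."""
    patient: str
    change: str
    required_approvers: set                  # the patient plus every treating provider
    approvals: set = field(default_factory=set)

    def approve(self, party):
        if party in self.required_approvers:
            self.approvals.add(party)

    def is_validated(self):
        # Mirrors the consensus idea: all parties must sign off before the
        # update is appended to the shared chain.
        return self.approvals == self.required_approvers

update = SharedRecordUpdate(
    patient="Jane Doe",
    change="New prescription: metformin 500 mg",
    required_approvers={"Jane Doe", "Dr. Lee (primary care)", "Dr. Patel (endocrinology)"})

update.approve("Dr. Lee (primary care)")
update.approve("Jane Doe")
print(update.is_validated())     # False: the endocrinologist has not yet signed off
update.approve("Dr. Patel (endocrinology)")
print(update.is_validated())     # True: the update can be added to the chain
```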

Blockchain could also reduce the health care industry’s susceptibility to privacy attacks or breaches because of its decentralized and distributed structure. Privacy attacks often involve a hacker entering a system or a database, but, with blocks held in multiple locations instead of one database, blockchain technology would help to minimize hacker infiltration.

However, as with any heavily regulated industry, implementing blockchain will not be easy. There are legal and regulatory roadblocks that hinder blockchain’s viability. The Health Insurance Portability and Accountability Act (HIPAA), for example, could hinder the sharing of health information across a network of computers because of its restrictions on the sharing of protected health information (PHI). Furthermore, state and federal laws would have to be updated to facilitate this technological advance. Despite these hurdles, there may be a glimmer of hope. The Centers for Medicare & Medicaid Services is dedicated to improving interoperability and patients’ access to health information through its Promoting Interoperability program. The agency’s push to move health care toward EHRs has the potential to be pivotal if the industry uses blockchain or a similar technology to improve patient access to health information.

Blockchain may not be a solution for today—it will take time to change state and federal laws regarding health information to accommodate such technology. However, the promotion of initiatives encouraging the use of EHRs may be priming the industry to provide a place for blockchain in the future.

10 surprising uses for analytics in healthcare – Health Data Management

Palmetto Primary Care Physicians worked with Optum Analytics to use artificial intelligence and machine learning predictive models to conduct risk stratification analyses to identify those who “ride the line on pre-diabetes and are on their way to being introduced to diabetes,” says Terry Cunningham, CEO at Palmetto. The result was improved care and savings of $4 million across Palmetto’s accountable care organization contracts.
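
The story does not describe the actual models used, but a risk-stratification analysis of this kind is commonly built along the following lines: train a classifier on historical patient data, score the current panel, and flag patients whose predicted risk sits near the decision boundary (those who “ride the line”). The features, thresholds, and use of scikit-learn below are illustrative assumptions, not Palmetto’s or Optum’s implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [HbA1c (%), BMI, age]; label 1 = later developed diabetes.
X = np.array([[5.2, 24, 45], [5.9, 31, 58], [6.3, 35, 62], [5.4, 27, 50],
              [6.1, 33, 66], [5.7, 29, 55], [6.4, 38, 70], [5.1, 22, 40]])
y = np.array([0, 1, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score the current panel and flag "borderline" patients for outreach.
panel = np.array([[5.8, 30, 57], [5.3, 25, 48], [6.0, 32, 61]])
risk = model.predict_proba(panel)[:, 1]
borderline = (risk > 0.4) & (risk < 0.7)   # illustrative thresholds, not clinical guidance
for features, r, flag in zip(panel, risk, borderline):
    print(features, f"risk={r:.2f}", "flag for pre-diabetes outreach" if flag else "")
```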

Experts examine Asia’s approach to cybersecurity – Brookings Institution

Security challenges in Asia come not only from nuclear threats or geopolitical conflicts, but also, increasingly, from cyberspace. As Jung Pak, senior fellow in the Brookings Center for East Asia Policy Studies and SK-Korea Foundation Chair in Korea Studies, said during a June conference at Brookings on cybersecurity in Asia: “Cyber is a threat we can’t see, that cross borders, and is one of the tools of coercive diplomacy.”

Author

Paul Park

Senior Research Assistant – Center for East Asia Policy Studies

As experts discussed at the event, compared to traditional security threats, cyber threats are harder to detect or attribute, more often transnational or even transcontinental, and can be very disruptive at a relatively low cost to perpetrators. Cybersecurity has become especially important for Asia, which is home to a significant number of cyber perpetrators and their targets. Below are some key takeaways from the conference, during which a range of U.S. and South Korean cybersecurity experts discussed the capabilities and intentions of regional actors and examined government policies to counter evolving threats.

The cyber landscape in Asia

In a keynote address, former Coordinator for Cyber Issues at the U.S. Department of State Chris Painter argued that we should perceive cybersecurity as both a threat and an opportunity for Asia. He explained that regional actors like China and North Korea have frequently exercised their cyber power to achieve their strategic goals around the globe. Yet their motivations and objectives differ: While North Korea primarily aims to develop capabilities for revenue generation and destructive capabilities for potential conflicts outside North Korea, China mainly utilizes its cyber means for espionage and intellectual property theft. “Naming and shaming” has been an effective tool against China because of the government’s concerns about potential blowback to its soft power, Painter said.

Painter also said that the rest of the region is starting to recognize vulnerabilities in its own defense systems and striving to catch up. This growing awareness of cyberattacks has prompted countries like Singapore, Japan, and South Korea to increase their investments in cyber capabilities in recent years. Most notably, Singapore—which was once thought to be one of the most vulnerable countries in the world to cyber threats—has become a regional leader and has assisted other ASEAN countries in developing their cyber capacities.

North Korea and China

Sangmyung Choi of Hauri Inc., a Seoul-based cybersecurity firm, outlined the spectrum of North Korea’s cyberattacks toward South Korea. In sum, North Korea has hacked or attempted to hack almost every well-known industry, institution, government agency, and large corporation in South Korea. It is also capable of simultaneously affecting a high volume of systems. According to Choi, North Korean hackers in 2009 infiltrated about 400,000 computers in South Korea through a distributed denial of service (DDoS) attack, using approximately 2,000 servers around the world. He detailed North Korea’s method of starting its attack from a subsidiary of the main target. For example, when North Koreans extracted nuclear reactor designs from a South Korean nuclear power plant, they were able to infiltrate the plant’s internal intranet by first hacking into one of the plant’s partner companies. Choi also noted that North Korean hackers tailor their attacks by identifying and taking advantage of certain South Korean vulnerabilities, such as the 23 known vulnerabilities found in the widely used Korean word processor, Hangul Office.

On Chinese cyber capabilities, William Carter, deputy director of the Technology Policy Program at the Center for Strategic and International Studies (CSIS), discussed Beijing’s move toward professionalization and civil-military fusion in cyber arenas. Unlike Russia, for instance, China largely limits its projection of cyber power to propaganda, intellectual property theft, and intelligence gathering. Building on Painter’s assessment of China’s concern for maintaining its soft power, Carter also stated that China prefers to be seen playing the role of the “good guy.” But for China to have its cake and eat it too, it must be more subtle in its operations and focus on strategic goals. He views the professionalization of China’s capabilities as another way to delegitimize the United States: Beijing not only wants to gain influence in the region, but to raise its standing in the international mechanisms that control the cyber sphere. In its push toward professionalization, China is increasingly consolidating its private-sector capabilities with its military intelligence services and focusing on long-term strategic goals, rather than disruptive attacks.

Priscilla Moriuchi, director of strategic threat development at Recorded Future, views the internet as another domain for North Korea’s criminal ventures, primarily in generating illicit funds for the Kim Jong-un regime. She detailed how North Korea commits cybercrimes by hacking banking operations, stealing and mining cryptocurrency, and carrying out other low-level financial crimes. She raised specific concerns regarding hacks against banking operations because domestic intrabank transfer systems tend to be “variously secured.” It is especially difficult to attribute these attacks since banks are not typically transparent and have no reason to publicize the incidents.

In the case of North Korea’s WannaCry attack in May 2017 and its use of cryptocurrency as ransom payments, many observers initially perceived North Korea’s tactics as naïve, but Moriuchi explained that this was actually not the case.

Cybersecurity policies

Experts also explored government policies to address cybersecurity concerns, starting with South Korea’s cyber defense measures. Professor Jong-in Lim of Korea University, a former Blue House special advisor on cybersecurity, pointed to the ineffectiveness of South Korea’s policies and measures in deterring North Korea. He said there is a lack of information sharing across government institutions. Without proper institutional and legal structures, the various government agencies—including the National Intelligence Service and Ministry of Defense—are reluctant to share information on cyber issues as each vies for influence. In light of improved relations between the two Koreas, Lim suggested that a possible peace treaty should also include an agreement on cyber issues. Such an agreement, he envisions, would incorporate a punishment mechanism to be developed in conjunction with China, Russia, and the United States to ensure that North Korea would unequivocally cease its illicit cyber activities toward South Korea.

James Baker, visiting fellow in the Brookings Governance Studies program, moderated the conversation that followed, asking the panelists for their views on the common challenges that countries face in cyberspace. According to Katherine Charlet from the Carnegie Endowment for International Peace, the lack of an incentive structure, time, money, and the sheer complexity of the issue all contribute to the difficulty in developing a robust cyber defense system. Adding to this difficulty is the fact that cyber actors have lately been more disruptive in their operations while further testing the limits of the international community. Despite this increased boldness in testing the waters, she acknowledged that cyber actors refrain from launching a globally destabilizing threat, which is also partly influenced by governments’ growing willingness to publicly attribute an attack to a specific group or country.

Michael Sulmeyer, director of the Cybersecurity Project at Harvard University’s Belfer Center, expressed his concern about the serious consequences that states could face when offensive cyber capabilities are combined with new technologies. As more states are able to acquire offensive capabilities with relative ease, there is the risk of crossing a new threshold when these capabilities are executed through the “internet of bodies,” which includes not only smartphones but also implants in our bodies. He believes that, more than the complexity and sophistication of offensive capabilities, the core challenge lies in the negligence and slowness of our defensive posture. In that regard, he urged increased accountability from government and the private sector. On the question of whether cyber deterrence exists, Sulmeyer strongly asserted that it does not at the moment.

Why the Future of Data Storage is (Still) Magnetic Tape – IEEE Spectrum

It should come as no surprise that recent advances in big-data analytics and artificial intelligence have created strong incentives for enterprises to amass information about every measurable aspect of their businesses. And financial regulations now require organizations to keep records for much longer periods than they had to in the past. So companies and institutions of all stripes are holding onto more and more.

Studies show that the amount of data being recorded is increasing at 30 to 40 percent per year. At the same time, the capacity of modern hard drives, which are used to store most of this information, is increasing at less than half that rate. Fortunately, much of this information doesn’t need to be accessed instantly. And for such things, magnetic tape is the perfect solution.

Seriously? Tape? The very idea may evoke images of reels rotating fitfully next to a bulky mainframe in an old movie like Desk Set or Dr. Strangelove. So, a quick reality check: Tape has never gone away!

Indeed, much of the world’s data is still kept on tape, including data for basic science, such as particle physics and radio astronomy, human heritage and national archives, major motion pictures, banking, insurance, oil exploration, and more. There is even a cadre of people (including me, trained in materials science, engineering, or physics) whose job it is to keep improving tape storage.

Tape has been around for a long while, yes, but the technology hasn’t been frozen in time. Quite the contrary. Like the hard disk and the transistor, magnetic tape has advanced enormously over the decades.

The first commercial digital-tape storage system, IBM’s Model 726, could store about 1.1 megabytes on one reel of tape. Today, a modern tape cartridge can hold 15 terabytes. And a single robotic tape library can contain up to 278 petabytes of data. Storing that much data on compact discs would require more than 397 million of them, which if stacked would form a tower more than 476 kilometers high.
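
Those comparisons are easy to sanity-check. Assuming a standard 700 MB capacity and 1.2 mm thickness per compact disc (standard figures, not stated in the article), the arithmetic comes out roughly as described:

```python
library_capacity = 278e15          # 278 petabytes, in bytes
cd_capacity = 700e6                # 700 MB per compact disc (assumed)
cd_thickness_mm = 1.2              # standard CD thickness (assumed)

cds_needed = library_capacity / cd_capacity
stack_height_km = cds_needed * cd_thickness_mm / 1e6

print(f"{cds_needed / 1e6:.0f} million CDs")    # ~397 million
print(f"{stack_height_km:.0f} km tall")         # just over 476 km
```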

It’s true that tape doesn’t offer the fast access speeds of hard disks or semiconductor memories. Still, the medium’s advantages are many. To begin with, tape storage is more energy efficient: Once all the data has been recorded, a tape cartridge simply sits quietly in a slot in a robotic library and doesn’t consume any power at all. Tape is also exceedingly reliable, with error rates that are four to five orders of magnitude lower than those of hard drives. And tape is very secure, with built-in, on-the-fly encryption and additional security provided by the nature of the medium itself. After all, if a cartridge isn’t mounted in a drive, the data cannot be accessed or modified. This “air gap” is particularly attractive in light of the growing rate of data theft through cyberattacks.

The offline nature of tape also provides an additional line of defense against buggy software. For example, in 2011, a flaw in a software update caused Google to accidentally delete the saved email messages in about 40,000 Gmail accounts. That loss occurred despite there being several copies of the data stored on hard drives across multiple data centers. Fortunately, the data was also recorded on tape, and Google could eventually restore all the lost data from that backup.

The 2011 Gmail incident was one of the first disclosures that a cloud-service provider was using tape for its operations. More recently, Microsoft let it be known that its Azure Archive Storage uses IBM tape storage equipment.

All these pluses notwithstanding, the main reason why companies use tape is usually simple economics. Tape storage costs one-sixth the amount you’d have to pay to keep the same amount of data on disks, which is why you find tape systems almost anyplace where massive amounts of data are being stored. But because tape has now disappeared completely from consumer-level products, most people are unaware of its existence, let alone of the tremendous advances that tape recording technology has made in recent years and will continue to make for the foreseeable future.

All this is to say that tape has been with us for decades and will be here for decades to come. How can I be so sure? Read on.

Tape has survived for as long as it has for one fundamental reason: It’s cheap. And it’s getting cheaper all the time. But will that always be the case?

You might expect that if the ability to cram ever more data onto magnetic disks is diminishing, so too must this be true for tape, which uses the same basic technology but is even older. The surprising reality is that for tape, this scaling up in capacity is showing no signs of slowing. Indeed, it should continue for many more years at its historical rate of about 33 percent per year, meaning that you can expect a doubling in capacity roughly every two to three years. Think of it as a Moore’s Law for magnetic tape.
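
The doubling estimate follows directly from the growth rate; a quick check, assuming a steady 33 percent annual increase:

```python
import math

annual_growth = 0.33
doubling_time_years = math.log(2) / math.log(1 + annual_growth)
print(f"{doubling_time_years:.1f} years to double")   # about 2.4 years
```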

That’s great news for anyone who has to deal with the explosion in data on a storage budget that remains flat. To understand why tape still has so much potential relative to hard drives, consider the way tape and hard drives evolved.

Both rely on the same basic physical mechanisms to store digital data. They do so in the form of narrow tracks in a thin film of magnetic material in which the magnetism switches between two states of polarity. The information is encoded as a series of bits, represented by the presence or absence of a magnetic-polarity transition at specific points along a track. Since the introduction of tape and hard drives in the 1950s, the manufacturers of both have been driven by the mantra “denser, faster, cheaper.” As a result, the cost of both, in terms of dollars per gigabyte of capacity, has fallen by many orders of magnitude.

These cost reductions are the result of exponential increases in the density of information that can be recorded on each square millimeter of the magnetic substrate. That areal density is the product of the recording density along the data tracks and the density of those tracks in the perpendicular direction.

Early on, the areal densities of tapes and hard drives were similar. But the much greater market size and revenue from the sale of hard drives provided funding for a much larger R&D effort, which enabled their makers to scale up more aggressively. As a result, the current areal density of high-capacity hard drives is about 100 times that of the most recent tape drives.

Nevertheless, because they have a much larger surface area available for recording, state-of-the-art tape systems provide a native cartridge capacity of up to 15 TB—greater than the highest-capacity hard drives on the market. That’s true even though both kinds of equipment take up about the same amount of space.

With the exception of capacity, the performance characteristics of tape and hard drives are, of course, very different. The long length of the tape held in a cartridge—normally hundreds of meters—results in average data-access times of 50 to 60 seconds compared with just 5 to 10 milliseconds for hard drives. But the rate at which data can be written to tape is, surprisingly enough, more than twice the rate of writing to disk.

Over the past few years, the areal density scaling of data on hard disks has slowed from its historical average of around 40 percent a year to between 10 and 15 percent. The reason has to do with some fundamental physics: To record more data in a given area, you need to allot a smaller region to each bit. That in turn reduces the signal you can get when you read it. And if you reduce the signal too much, it gets lost in the noise that arises from the granular nature of the magnetic grains coating the disk.

It’s possible to reduce that background noise by making those grains smaller. But it’s difficult to shrink the magnetic grains beyond a certain size without compromising their ability to maintain a magnetic state in a stable way. The smallest size that’s practical to use for magnetic recording is known in this business as the superparamagnetic limit. And disk manufacturers have reached it.

Until recently, this slowdown was not obvious to consumers, because disk-drive manufacturers were able to compensate by adding more heads and platters to each unit, enabling a higher capacity in the same size package. But now both the available space and the cost of adding more heads and platters are limiting the gains that drive manufacturers can make, and the plateau is starting to become apparent.

There are a few technologies under development that could enable hard-drive scaling beyond today’s superparamagnetic limit. These include heat-assisted magnetic recording (HAMR) and microwave-assisted magnetic recording (MAMR), techniques that enable the use of smaller grains and hence allow smaller regions of the disk to be magnetized. But these approaches add cost and introduce vexing engineering challenges. And even if they are successful, the scaling they provide is, according to manufacturers, likely to remain limited. Western Digital Corp., for example, which recently announced that it will probably begin shipping MAMR hard drives in 2019, expects that this technology will enable areal density scaling of only about 15 percent per year.

In contrast, tape storage equipment currently operates at areal densities that are well below the superparamagnetic limit. So tape’s Moore’s Law can go on for a decade or more without running into such roadblocks from fundamental physics.

Still, tape is a tricky technology. Its removable nature, the use of a thin polymer substrate rather than a rigid disk, and the simultaneous recording of up to 32 tracks in parallel create significant hurdles for designers. That’s why my research team at the IBM Research–Zurich lab has been working hard to find ways to enable the continued scaling of tape, either by adapting hard-drive technologies or by inventing completely new approaches.

In 2015, we and our collaborators at FujiFilm Corp. showed that by using ultrasmall barium ferrite particles oriented perpendicular to the tape, it’s possible to record data at more than 12 times the density achievable with today’s commercial technology. And more recently, in collaboration with Sony Storage Media Solutions, we demonstrated the possibility of recording data at an areal density that is about 20 times the current figure for state-of-the-art tape drives. To put this in perspective, if this technology were to be commercialized, a movie studio, which now might need a dozen tape cartridges to archive all the digital components of a big-budget feature, would be able to fit all of them on a single tape.

To enable this degree of scaling, we had to make a bunch of technical advances. For one, we improved the ability of the read and write heads to follow the slender tracks on the tape, which were just 100 or so nanometers wide in our latest demo.

We also had to reduce the width of the data reader—a magnetoresistive sensor used to read back the recorded data tracks—from its current micrometer size to less than 50 nm. As a result, the signal we could pick up with such a tiny reader got very noisy. We compensated by increasing the signal-to-noise ratio inherent to the media, which is a function of the size and orientation of the magnetic particles as well as their composition and the smoothness and slickness of the tape surface. To help further, we improved the signal processing and error-correction schemes our equipment employed.

To ensure that our new prototype media can retain recorded data for decades, we changed the nature of the magnetic particles in the recording layer, making them more stable. But that change made it harder to record the data in the first place, to the extent that a normal tape transducer could not reliably write to the new media. So we used a special write head that produces magnetic fields much stronger than a conventional head could provide.

Combining these technologies, we were able to read and write data in our laboratory system at a linear density of 818,000 bits per inch. (For historical reasons, tape engineers around the world measure data density in inches.) In combination with the 246,200 tracks per inch that the new technology can handle, our prototype unit achieved an areal density of 201 gigabits per square inch. Assuming that one cartridge can hold 1,140 meters of tape—a reasonable assumption, based on the reduced thickness of the new tape media we used—this areal density corresponds to a cartridge capacity of a whopping 330 TB. That means that a single tape cartridge could record as much data as a wheelbarrow full of hard drives.
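Those figures are easy to cross-check. What follows is a back-of-the-envelope sketch; the usable recording width is inferred from the quoted numbers rather than stated in the article, and the comparison assumes the 12.65-millimeter (half-inch) tape used in LTO-style cartridges:

```python
# Back-of-the-envelope check of the demo figures quoted above.
linear_density = 818_000                  # bits per inch along each track
track_density = 246_200                   # tracks per inch across the tape
areal_density = linear_density * track_density          # bits per square inch
print(f"{areal_density / 1e9:.0f} Gb per square inch")  # -> 201

tape_length_in = 1_140 / 0.0254           # 1,140 meters of tape, in inches
capacity_bits = 330e12 * 8                # 330 TB expressed in bits
# Back-solve for the recording area, and hence the usable width, implied by 330 TB:
area_sq_in = capacity_bits / areal_density
usable_width_mm = area_sq_in / tape_length_in * 25.4
print(f"implied usable data width: {usable_width_mm:.1f} mm")  # -> about 7.4 mm
# A bit more than half of the 12.65-mm tape width, which seems plausible once
# servo bands, guard bands, and format overhead are set aside.
```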

In 2015, the Information Storage Industry Consortium, an organization that includes HP Enterprise, IBM, Oracle, and Quantum, along with a slew of academic research groups, released what it called the “International Magnetic Tape Storage Roadmap.” That forecast predicted that the areal density of tape storage would reach 91 Gb per square inch by 2025. Extrapolating that trend suggests it will surpass 200 Gb per square inch by 2028.
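For what it’s worth, the annual scaling rate implied by those two roadmap figures is easy to back out; here is a quick sketch using only the numbers quoted above:

```python
# Implied annual areal-density growth between the two roadmap points above.
density_2025, density_2028 = 91, 200       # Gb per square inch
years = 2028 - 2025
annual_growth = (density_2028 / density_2025) ** (1 / years) - 1
print(f"{annual_growth:.0%} per year")     # -> roughly 30% per year
```

That works out to roughly 30 percent a year, far above the 10 to 15 percent that hard drives are currently managing.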

The authors of that road map each had an interest in the future of tape storage. But you needn’t worry that they were being too optimistic. The laboratory experiments that my colleagues and I have recently carried out demonstrate that 200 Gb per square inch is perfectly possible. So the feasibility of keeping tape on its historical growth path for at least another decade is, to my mind, well assured.

Indeed, tape may be one of the last information technologies to follow Moore’s Law–like scaling, and it should maintain that pace for the next decade, if not beyond. That streak will only widen tape’s cost advantage over hard drives and other storage technologies. So even though you may rarely see it outside of a black-and-white movie, magnetic tape, old as it is, will be here for years to come.

This article appears in the September 2018 print issue as “Tape Storage Mounts a Comeback.”

About the Author

Mark Lantz is manager of Advanced Tape Technologies at IBM Research–Zurich.


The $4 Billion Crypto Billionaire Who No One Has Heard Of – Forbes Now

Micree Zhan, cofounder and co-CEO of Bitmain Technologies. (Photo: Bitmain Technologies)

With rumors and news abuzz about its upcoming IPO, Chinese cryptocurrency mining chip firm Bitmain has been thrust into the spotlight in recent months. Jihan Wu, Bitmain’s cofounder and co-CEO, has gotten most of the media coverage, largely because of his prominent online social presence.

Meanwhile, Micree Zhan, Bitmain’s other cofounder and co-CEO, has kept a much lower profile but owns a much bigger stake in Bitmain than Wu does. Zhan is Bitmain’s technical mastermind and owns 36.58% of the company, while Wu’s stake is 20.5%, according to the leaked pre-IPO investor decks. The Beijing-based company is confident that it will achieve a $14 billion market capitalization once it offers shares to the public, according to its pre-IPO investor materials. That would make Zhan’s stake worth at least $5.1 billion and Wu’s stake worth nearly $2.9 billion.

Forbes isn’t so bullish. After talking with analysts, we estimate Zhan’s net worth at around $4 billion and Wu’s at close to $2.3 billion. Forbes valued Bitmain by applying price-to-sales ratios from comparable companies including Nvidia, AMD, Qualcomm, Mediatek and Cisco.

It is important to note that there are no directly comparable companies given the unique nature of Bitmain. We picked these companies because they are also fabless chip makers, companies that design and sell chips and hardware but outsource production to another manufacturer, per Daiwa Capital Markets analyst Rick Hsu’s guidance. Bitmain is not purely a semiconductor company; a portion of its revenue is also derived from its mining pools, BTC.com and Antpool. And given the novelty and volatility of the current cryptocurrency industry, the application of these mining chips is uncertain, according to Mark Li, a senior technology analyst at Sanford C. Bernstein & Co. Brett Simpson of Arete Research noted that Bitmain should be viewed as an early-stage version of Cisco, back when there was plenty of uncertainty about the future of IP.
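To show how the comps approach Forbes describes works in principle, here is a simplified, hypothetical sketch. The peer multiples, revenue estimate, and risk discount are placeholders invented for illustration, not Forbes’ actual inputs; only the ownership stakes come from the leaked decks:

```python
# Hypothetical comparable-company valuation using price-to-sales (P/S) multiples.
# All inputs except the ownership stakes are illustrative placeholders.
from statistics import median

peer_ps_multiples = [8.0, 5.0, 4.0, 3.5, 3.0]    # hypothetical fabless-chip peers
estimated_annual_revenue = 5.0e9                 # hypothetical revenue run rate, USD
crypto_risk_discount = 0.5                       # haircut for volatility/uncertainty

implied_company_value = (median(peer_ps_multiples)
                         * estimated_annual_revenue
                         * crypto_risk_discount)

zhan_stake, wu_stake = 0.3658, 0.205             # stakes reported in the decks
print(f"implied company value: ${implied_company_value / 1e9:.1f}B")
print(f"Zhan: ${zhan_stake * implied_company_value / 1e9:.2f}B, "
      f"Wu: ${wu_stake * implied_company_value / 1e9:.2f}B")
```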

Despite his role as the technical backbone behind Bitmain, little is known about Zhan, an engineer who developed the custom application-specific integrated circuit (ASIC) chips that have propelled the global growth of the largest crypto-mining chip producer. Even his LinkedIn profile is stark, listing only “CEO at Bitmain” and nothing about prior jobs or education.

Here’s what we know so far. After graduating from Shandong University with a degree in electrical engineering in 2001, Zhan received a master’s in engineering from the Chinese Academy of Sciences’ Institute of Microelectronics in 2004. He went to work at Tsinghua University as a research and development engineer in the Research Institute of Information Technology.

In 2006, he began a new job as the head of research and development and manager of the integrated circuit department at Chinese company Unitend Technologies, which specializes in circuit design. At Unitend, Zhan oversaw the design and development of specific chips for digital television; the shipment volume of these chips exceeded 1 million during his time there. He’s also published numerous papers and patents about circuit chips and helped write the national standard for universal transport interfaces, a protocol used in digital television devices. In 2010, Zhan founded DivaIP Technologies, a Beijing-based startup to develop TV set-top boxes. He met Wu by chance when the startup was canvassing the streets of Beijing, and Zhan sought advice from Wu regarding funding, Quartz reports. Though Wu was unable to help in that specific regard, the two would meet up again three years later.

The younger of the duo, Wu graduated from Peking University in 2009 with a dual degree in economics and psychology. According to his LinkedIn profile, he worked as an investment manager at private equity firm China Grand Prosperity Investment from 2010 to 2013, before he cofounded Bitmain. A cofounder of 8BTC, a China-based Bitcoin forum launched in 2011, Wu is reportedly the first person to translate the bitcoin whitepaper—the original report written by Satoshi Nakamoto that explained the fundamentals of bitcoin—into Mandarin Chinese. He also drew attention to himself in 2016 when he tweeted a vitriolic response to someone who criticized his support for Bitcoin Cash.

In 2013, Wu approached Zhan about founding Bitmain together. Wu sought Zhan’s expertise in chip design to develop the specialized mining chips that virtual currency mining requires. Mining bitcoin and other cryptocurrencies takes the brute computational force that these ASIC chips provide in order to solve the cryptographic puzzles that verify transactions on a blockchain. The morning after their meeting, Zhan spent two hours poring over the Wikipedia page about bitcoin and promptly agreed to join the venture, Bloomberg reports.
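For readers wondering what that brute force actually consists of, here is a heavily simplified proof-of-work sketch in Python. Real bitcoin mining hashes 80-byte block headers against a network-set difficulty target, and ASICs like Bitmain’s perform this search many orders of magnitude faster than any general-purpose code could:

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force search for a nonce whose double-SHA-256 hash of
    (block_data + nonce) falls below a difficulty target. This is the
    computation that mining ASICs implement in dedicated hardware."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Toy example: a 20-bit difficulty takes on the order of a million hash attempts.
print(mine(b"example block header", difficulty_bits=20))
```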

Four years later, in 2017, Bitmain brought in $2.5 billion in revenue, predominantly from sales of its cryptocurrency mining equipment, according to the investor decks. In the first quarter of 2018 alone, Bitmain had $1.9 billion in revenue. The company hopes to complete its IPO on the Hong Kong Stock Exchange before the end of the year, but few details are known thus far. This could be a monumental step not just for Bitmain but also for the larger cryptocurrency community. The industry has been attempting to nudge its way back into the spotlight after a brief frenzy over cryptocurrencies that began at the end of 2017. Since the investor decks leaked, however, doubts have arisen about the feasibility of the IPO, owing to questions about the company’s performance in the second and third quarters of 2018.

The First 3 Things ACOs Should Do with Their Data – Healthcare Informatics

With the U.S. healthcare system undergoing rapid, transformative change, one of the big unanswered questions is, what will happen to hospital-physician alignment over time? Many physicians, burdened by bureaucracy and practice management challenges, are fleeing into employment by hospitals or by large hospital-affiliated or hospital-owned physician groups, while others are entering into a variety of contractual relationships designed to keep them afloat in practice. In a sense, physician alignment is a sort of “wild card” in the emerging healthcare system. How will accountable care and value-based healthcare delivery and payment work—will physicians mostly participate in those arrangements simply as hospital and health system employees, or will they chart a different course, somewhere between the extreme autonomy they’ve had under discounted fee-for-service reimbursement, and straight hospital system employment? No one really knows for sure, and it appears that things could evolve quite differently across the diverse metropolitan and regional healthcare markets of the U.S.

Some of those issues were discussed in one of the articles in this year’s Healthcare Informatics Top Ten Tech Trends story package, published in the third issue of this year: “Markets and Medicine—Where Do Physicians Land, in the Emerging World of U.S. Healthcare?” One of the industry leaders interviewed for that article was Tricia Nguyen, M.D., a senior executive at the Falls Church, Va.-based Inova Health System who came to Inova last year to help lead and expand a clinically integrated network and joint-venture insurance company that Inova had created with Aetna. “My title was CEO of Commonwealth Health Innovation Network,” she explains. “But as it turns out, I found within 30 days that we didn’t have much to scale, so I’ve been focused internally, and now lead our population health efforts under the title of senior vice president for population health.”

Tricia Nguyen, M.D.

Dr. Nguyen sees a host of challenges, as well as definite opportunities, in the near future, in terms of how to get physicians and hospital-based health systems on the same page, and aligned to partner for the emerging future of healthcare.

Speaking of the northern Virginia healthcare market, in which the five-hospital Inova system has a significant market share, Nguyen says, “This market is in a bubble; 70 percent of the population is insured by some carrier, employer-sponsored generally.” The system serves a very affluent population, with “double-income, double-degree families”; and Inova controls 60 to 70 percent market share in its area. Given that market dominance, she says, “For the system, there’s not a real pressure to change, but we saw a real opportunity in this joint venture with Aetna. Providers are still making a lot of money on the fee-for-service payment schedule, because so many of their patients are commercially covered, so they don’t have to deal with a lot of government products. Some family practice and internist physicians have a high percentage of Medicare, but many are no more than 50 percent Medicare. Many are 70 to nearly 100 percent commercial. So as far as fee for value, they’re worried about MIPS and MACRA, and they want help with that.”

The important revelation that’s emerging for her and her colleagues, Nguyen says, is the realization that physician alignment to date has been missing a key component. “The primary care physicians have essentially been doing value-based care for several years here, but only for the CareFirst population,” she says, referencing CareFirst, a regional BlueCross BlueShield company that offers a range of health plans across Maryland, the District of Columbia, and northern Virginia. “Have they changed practice patterns? A little bit, but not much.” And, importantly, she says, she and her colleagues are realizing that “Fuller value will come when we can identify the high-value specialists, those who are high-performing, low-cost, given the way they practice, and using them. That to me is the secret, and no one has that secret mastered quite yet. And that’s because the tools don’t really exist to help. I believe that ACOs are too focused on primary care, and that primary care has to bear the burden to drive down the costs of care, when in fact it’s going to take collaboration with the specialists. While care coordination is important for holistic health, to generate real savings, you’re going to have to drive down specialty care costs as well.”

Inova encompasses five acute-care facilities and employs about 500 providers, about 120 of them in primary care and the rest specialists. Inside the broader umbrella of value-based contracting, Inova operates Signature Partners, an accountable care organization with about 34,000 lives, along with a clinically integrated network (CIN). Signature Partners has been in place for over three years. “This year,” Nguyen says, “we decided to split up into two ACOs; one is a high-performing Inova medical group; the other is the original ACO that we kept going. The 34,000 number encompasses both.”

One of the challenges, Nguyen says, is that “The specialists are not yet thinking about value. But the primary care doctors have been on a semi-value journey. CareFirst has created a PCMH [patient-centered medical home] model and payment model for primary care, to keep them independent. They’ve been very successful in their model. They give PCPs a certain base, and it’s under Medicare rates. And if they deem you to be PCMH, they give you x bump in your fee schedule. Once you’ve met that, for the level of engagement and savings you generate, you get that dollar amount based on your engagement and quality, as a form of an additional increase in your fee schedule for the next year.”

One of the challenges, Nguyen notes, is around the geography of her organization’s service area. “CareFirst is a health plan. They’re in Maryland and cross over to northern Virginia. It’s interesting, the relationship that CareFirst has with Anthem. There’s a Highway 123 in northern Virginia. And CareFirst does not cross 123, and neither does Anthem. The program that they’ve had in place since 2012 has created virtual pods with primary care physicians, where they aggregate them together and call them a pod, and have engagement leaders and care managers, and they’re incentivized to work with care managers from CareFirst. So the primary care physicians have essentially been doing value-based care for several years here, but only for the CareFirst population.” Hence the real but still-modest change in practice patterns that has been elicited from that set of contractual relationships.

Is this an example of the proverbial “one foot in the boat, one foot on the shore” phenomenon that so many are witnessing in U.S. healthcare right now? “Yes, absolutely,” Nguyen says. The existence of so many different payment systems “has kind of forced the market to think this way. About 120 of our employed medical group has a large book of business with CareFirst, and so they act differently with those populations now. They treat them as though it’s a fee-for-service environment, but our own primary care practices within are also different. There was a practice we acquired about a year ago that’s probably the premier primary care group within CareFirst. They’re one of the most efficient; and I can say that because their operational incentive award amounts are very high, among the highest in CareFirst.”

Speaking of that specific group, Nguyen says, “When I look at their ACO performance per beneficiary spend, their target spend is on average about $7,500 at most; many could run $11,000 per member per year. The way they practice is just very different. They try to manage everything virtually, telephonically, etc. We’ve started to try to uncover and find areas of opportunity to spread across our medical group, but also across our CIN. Our CIN doctors don’t really have an incentive to change. We’ve been able to generate change within the group, but the MSSP has not generated savings, so I think they’re becoming a bit disillusioned.”

What is the secret of that group’s success? Nguyen returns to the point she made earlier: identifying the high-value, low-cost specialists and using them. “And we have about 100 cardiologists we employ; and in a population of 100,000, how many cardiologists do you need? I probably have 20 times the cardiologists I need; I’m just guessing about the precise proportion, but we have an oversupply.”

Given the complexity of that situation, what is the solution to the path into value? “The solution,” Nguyen says, “is that they’re going to have to start tiering their network—by physician and not by group. They’re stuck in contracting by group. CMS [the federal Centers for Medicare and Medicaid Services] and the private payers will have to get to the level of contracting at the individual specialist level. This is how contracts happen today—under the tax ID number; and the performance of the individual provider in a group gets mixed in with their peers. And so it’s impossible to get down to that level.”

So what can CMIOs, chief quality officers, and other health system leaders do, to promote change in this context? “They can engage in provider profiling at the MPI level,” Nguyen says firmly. “Health system CMIOs need to start thinking about hospital-based specialist performance data, and claims data, in a broader, more strategic context. Nobody’s done that yet. Everybody’s mired in the whole concept of integrating EHR [electronic health record] and claims data. But so far, integrating EHR and claims data has led only to more robust reporting on select measures, but it’s primary care-specific. There’s no integrated provider reporting across their EHR, practice management data and claims data, to understand specialty care.”

For example, Nguyen says, “Take a cardiologist who practices in the hospitals and also bills for services. There’s a set of activities they do in the inpatient space that could be integrated. For example, if a patient is admitted for an acute MI, does the cardiologist provide that care, or does the cardiologist also bring other specialists in? If they’re in for an acute MI, they could manage the person’s condition in consultation with a hospitalist internist, with follow-up by a diabetologist, for example. But unless a person is in acute renal failure, they don’t need acute care from a nephrologist. The data that can measure performance by that specialist is available in the hospital data; you can see it in the claims data as well. But if you take the case of an orthopedic surgeon who does a procedure in the hospital, one surgeon could cost more than another based on the prosthetics and implants. That data is wrapped into the DRG the hospital gets paid, and the hospital gets dinged, not the physician. And so it’s not in the claims data.”

In other words, she says, “We need to think about whom we’re holding accountable for the cost of care. It’s a shared responsibility. You have to manage the referral network and guide members to the high-value specialist. There are some basic things they can do with chronic conditions; but they must collaborate with their specialist peers.”
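There is no off-the-shelf toolset for the kind of individual-specialist profiling Nguyen describes, but the basic idea can at least be sketched from claims data. The following is a hypothetical illustration in Python; the column names, quality scores, and tiering rule are assumptions made for the sake of the example, not an Inova or CMS specification:

```python
import pandas as pd

# Hypothetical claims extract: one row per episode of care, attributed to an
# individual specialist rather than to the group's tax ID number.
claims = pd.DataFrame({
    "specialist_id": ["A", "A", "B", "B", "C"],
    "episode_cost":  [8200, 7600, 11400, 12900, 7900],
    "quality_score": [0.92, 0.92, 0.81, 0.81, 0.95],
})

profile = claims.groupby("specialist_id").agg(
    avg_episode_cost=("episode_cost", "mean"),
    quality=("quality_score", "first"),
    episodes=("episode_cost", "size"),
)

# Illustrative tiering rule: high quality and at-or-below-median cost = "high value".
median_cost = profile["avg_episode_cost"].median()
profile["tier"] = [
    "high value" if q >= 0.90 and c <= median_cost else "standard"
    for q, c in zip(profile["quality"], profile["avg_episode_cost"])
]
print(profile)
```

Tiering at the level of the individual clinician, rather than the group contract, is precisely the shift Nguyen argues payers and ACOs will eventually have to make.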

Does Dr. Nguyen have any other advice for CMIOs and CIOs? “Yes,” she says. “Don’t over-invest in EHR data for ACO quality measures at this time; focus on claims data. Everyone dismisses claims data. Inova has over-invested in EHR data that hasn’t yet generated savings. Claims data will help generate savings. Measuring quality, EHR yes, but a lot can be done through claims, if they’d just use the G-codes. And ACOs today are the price-takers from the payers and plans; they really should be the price-makers. And say, it’s going to cost $300 PMPM [per member per month] and not say, I’m going to take a percentage of premium, because percentage of premium could be a very arbitrary number.”

Wilson’s drone tech flying high – Greenville Daily Reflector

WILSON — Reconstructing car crashes, surveying utility damage and evaluating flooding are just a few of the times a drone comes in handy for city of Wilson staff. Thanks to a grant, officials are testing an app that will allow multiple departments to view drone footage.

“We’d been working on traditional uses for drones such as 3D modeling to more quickly document and allow police to clear a crash scene,” said Greenlight Community Broadband General Manager Will Aycock. “We realized it was useful to take that feed and not only capture it but stream it to the operation center and other staff, so we applied for this grant to allow us to develop the platform to transition from just data collection to data streaming.”

US Ignite, a national nonprofit working to promote the integration of smart city technology, selected Wilson for a $10,000 grant late last year, allowing the city to partner with Triangle UAS to develop an app that provides for secure, real-time streaming of video and other data from the city’s three drones.

“One of the immediate-use scenarios we’re talking about is doing assessments. Say we have damage to a feeder or a transmission line, especially in more rural areas, we can get eyes on it with the drone, share the info back in the office, which will help us get the right resources headed in the right direction quicker,” Aycock said. “One of the really neat things with the app is we developed it with security and privacy in mind, so we can loop in multiple people. If we need police, electric and gas staff, we can loop those people in wherever they are through the app, and there is even the potential for public access if there is an incident that should go out to the public.”

Rebecca Agner, Wilson communications and marketing director, said in the event of flooding or natural disasters, footage from the drones could eliminate the risk to residents who want to see the damage firsthand.

“During the April 2017 flooding and flooding from Hurricane Matthew, we were getting footage out to the public through the city and the newspaper, and that was without drones,” she said. “We heard from police that doing so really helped keep people out of unsafe driving conditions, so I think the city absolutely would make that feed available during large-scale events in the future.”

Triangle UAS has partnered with the city of Wilson in the past through a Gig East meetup event that focused on residential, commercial and government uses for drones. Aycock said prior to the grant, many of the necessary components — such as streaming video to the drone operator’s tablet — existed, but Triangle UAS put the components together into an easy-to-use app.

“One of the other byproducts of this project is our development of the best practices, forms and manuals required to have a proper drone program within a city, and that’ll be shared across the state,” Aycock said.

Scott Turnbull, national technology leader for US Ignite, said Wilson is not only a leader of smart technology integration across the state, but across the nation.

“Wilson really is at the forefront of thinking on integrating technology,” Turnbull said. “Other cities are thinking about it and they’ve got some drone hobbyists in different departments, but Wilson is leading the way.”

Agner said smart technology as well as the city’s drone integration will be highlighted at the next Gig East, a tech conference scheduled for Oct. 30 at the Edna Boykin Cultural Center. For more information on the event, visit www.gigeast.com.

“I think drones will be like GIS (geographic information system) technology 20 years ago. We used it for very, very specific functions, but as we started to use it, we saw all the possibilities, so the use of GIS exploded,” Aycock said. “I think this — not just drones, but the internet of things — will be very similar, particularly when you look at all of the functions of drones with all the sensors they can be outfitted with, so it really can be shared across all city operations.”

A prototype was functioning this spring, but it was only finished and readied for testing in the last month. Aycock said the rollout of the app and the integration of the city’s four drone operators will take place in the coming months.

“The technology is ready to go, but we’re still assessing it in controlled environments before it is ready to be deployed in real-world scenarios,” Aycock said. “Because we’re a Gigabit City, we’re developing the technology through this grant, but we’ll also have access to other bright ideas from other communities.”
