What is the 21st Century Cures Act? – Definition from WhatIs.com – TechTarget

The 21st Century Cures Act is a wide-ranging healthcare bill that funds medical research and development, medical device innovation, mental health research and care, opioid addiction treatment and prevention, and health information technology. Over 10 years, the legislation provides $4.8 billion to the National Institutes of Health, $500 million to the Food and Drug Administration and $1 billion in grants to states to fight opioid addiction.

The bill, known as “Cures,” was approved by large bipartisan majorities in both the House and the Senate and was signed into law by President Barack Obama on Dec. 13, 2016. The Cures bill also significantly loosens FDA regulation of the development of pharmaceutical drugs and advanced medical devices and eliminates FDA regulation of low-risk health apps.

However, the act makes funding for most of its provisions contingent on Congress appropriating money for them each year. This mechanism helped secure support from fiscally conservative lawmakers who worried that the new spending would worsen the federal budget deficit.

Several high-profile opponents among the 31 lawmakers who voted against the bill, including Sens. Bernie Sanders of Vermont and Elizabeth Warren of Massachusetts, criticized it as a giveaway to the pharmaceutical and medical device industries.

Mental and behavioral health advocates call the bill the first major advance in a decade in funding research and treatment for people with mental illness, including promising efforts to intervene in the early stages of psychosis.

Some critics, though, argued that the act went too far in relaxing some privacy provisions in the name of better treatment.

Goals of the legislation

Congress intended Cures – with its broad reach across medical research and development, drugs and devices, mental health and health IT – to be a definitive step toward modernizing the U.S. healthcare system and recognizing the central roles of technology and science.

The bill also recognizes the importance of and provides funding for cutting edge research projects such as the Precision Medicine Initiative, BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), Regenerative Medicine Innovation Project and Cancer Moonshot.

The act's FDA provisions include measures to streamline the design of clinical trials and expedite the approval of medical devices that have demonstrated potential to treat unmet medical needs and life-threatening conditions.

Impact on health IT

One of the 21st Century Cures Act interoperability provisions was among the law’s first to be put into action, with the January 2018 announcement by the Office of the National Coordinator for Health Information Technology (ONC) of the Trusted Exchange Framework.

The law includes specific language directing ONC to set up the framework – a nationwide system for sharing health data among networks run by healthcare systems, health information exchanges, insurers and other healthcare organizations.

Some of the law’s other health IT interoperability measures clarify ONC’s authority to certify health IT software to ensure that electronic protected health information is transferred securely and patients have unfettered access to their own health data.

Cures also gives the government authority to bar vendors and healthcare organizations from the practice known as information blocking, or impeding the flow of health data among healthcare providers, networks, vendors and patients.

Impact on mental health

The 21st Century Cures Act's mental health impact is considerable, according to the American Psychiatric Association.

Among other provisions, the legislation:

  • Establishes a new position, the assistant secretary for mental health and substance abuse, intended to coordinate fragmented mental health resources across the federal government
  • Creates another new position, chief medical officer at the Substance Abuse and Mental Health Services Administration (SAMHSA)
  • Requires SAMHSA to develop a strategic plan every four years to better recruit, train and retain mental health and substance use disorder workers
  • Reauthorizes grants to support integrated care for mental and behavioral health, train mental health workers in evidence-based care, and fund college, university and professional programs to expand internships and field placement programs
  • Strengthens enforcement of the Mental Health Parity and Addiction Equity Act of 2008, which mandates that insurers treat mental and physical health issues equally

History

The bill was introduced on Jan. 6, 2015 by Rep. Suzanne Bonamici, D-Ore.

The House approved the bill on Jan. 7, 2015, and the Senate approved it with an amendment on Oct. 6, 2015, sending it back to the House, which agreed to the amendment on Nov. 30, 2016.

On Dec. 7, 2016, the Senate agreed to a new House amendment, and on Dec. 8, 2016, the bill was presented to the president, who signed it into law five days later.

Congress should close the loophole allowing warrantless digital car searches – TechCrunch

Most Americans expect the Fourth Amendment — which protects individuals from illegal searches — to extend to their digital lives.

In general, this expectation matches reality: unless law enforcement comes knocking with a warrant, the government cannot search a person’s phone or computer. However, cars are treated differently, and as “connected cars” become increasingly linked to people’s digital identities, there is a risk that police will use this exception to conduct digital searches without warrants.

Congress should close this loophole.

The Fourth Amendment is the cornerstone of people’s right to privacy and freedom from government intrusion in the United States. It requires the government to get a warrant based on probable cause before conducting a search and seizure of personal property.

The Supreme Court has found these protections important enough to update them for the digital world. For example, the court has extended warrant protections to cell phones and vehicle GPS tracking, and it is currently reviewing whether law enforcement officials should be required to get a warrant to obtain cellphone location information from wireless carriers.

However, there is a long-standing vehicle exception to the Fourth Amendment's warrant requirement: law enforcement officials can stop and search a vehicle based on probable cause without having to get a warrant from a judge.


For example, police officers can stop a vehicle for a routine traffic violation and search it on the spot if they have probable cause to believe they will find contraband or evidence of a crime. This lower standard for government searches makes sense in a physical world, where vehicles can only hold so much information and drivers can easily drive away to dispose of evidence.

But cars are changing, both in terms of the amount and the sensitivity of the information they can hold. Next-generation vehicles generate gigabytes of data while driving, enabling a host of new applications that enhance convenience, safety, and efficiency for drivers.

When this information can be accessed either through a display interface in the car or programmatically through an on-board computer, law enforcement could gain access to a significant amount of data about drivers without a warrant. For example, police could access in-car apps that contain sensitive information, such as navigation apps that contain travel history, social media apps that store messages and other personal information, and payment apps that contain information about past purchases.

While some of these applications require passwords, many only do so when the driver first logs in. Therefore, they would likely be unlocked when police pull over a driver.

In addition, many drivers may be intimidated into revealing their passwords during a stop, as has happened to travelers forced to unlock their phones at border crossings.

Finally, police could retrieve information stored in an on-board computer which may collect and store a variety of potentially sensitive information about drivers, including their driving behavior. Already, some police use special devices designed to circumvent built-in security measures on citizens’ phones and quickly copy their contents — similar devices could be designed for cars.


Despite these potential risks, a car’s ability to collect information is not inherently privacy-invasive. And importantly, the automotive industry has taken pains to protect consumer privacy. For example, automakers made a series of public commitments in 2014 to establish strict privacy standards for data collected from vehicles, promising not to share consumer information with other businesses without affirmative consent — a standard that is higher than those found in other industries.

However, the auto industry cannot change the laws on digital searches. Policymakers should close this loophole to protect both citizens’ rights and support for technological progress. Congress has previously acted to close loopholes created by technological change.

For example, the Electronic Communications Privacy Act (ECPA), which limits how law enforcement can access digital information, sets different legal standards for obtaining email stored on a PC and email stored in the cloud. As cloud computing adoption has grown, Congress has worked to pass a legislative fix.

Just as Congress has been working to close the loophole for cloud computing, it should close the loophole created by the convergence of digital technology with vehicles. Congress should require law enforcement officials to obtain a warrant before they can access data from a vehicle.

Congress can do this while maintaining the vehicle exception for physical searches and maintaining law enforcement’s access to data held by third parties, such as automakers or wireless providers, through warrants or other lawful processes.

By upholding citizen privacy, Congress can ensure a smooth road ahead for vehicles of the future.

The New Rules of Talent Management – Harvard Business Review

Agile isn’t just for tech anymore. It’s been working its way into other areas and functions, from product development to manufacturing to marketing—and now it’s transforming how organizations hire, develop, and manage their people.

You could say HR is going “agile lite,” applying the general principles without adopting all the tools and protocols from the tech world. It’s a move away from a rules- and planning-based approach toward a simpler and faster model driven by feedback from participants. This new paradigm has really taken off in the area of performance management. (In a 2017 Deloitte survey, 79% of global executives rated agile performance management as a high organizational priority.) But other HR processes are starting to change too.

In many companies that’s happening gradually, almost organically, as a spillover from IT, where more than 90% of organizations already use agile practices. At the Bank of Montreal (BMO), for example, the shift began as tech employees joined cross-functional product-development teams to make the bank more customer focused. The business side has learned agile principles from IT colleagues, and IT has learned about customer needs from the business. One result is that BMO now thinks about performance management in terms of teams, not just individuals. Elsewhere the move to agile HR has been faster and more deliberate. GE is a prime example. Seen for many years as a paragon of management through control systems, it switched to FastWorks, a lean approach that cuts back on top-down financial controls and empowers teams to manage projects as needs evolve.

The changes in HR have been a long time coming. After World War II, when manufacturing dominated the industrial landscape, planning was at the heart of human resources: Companies recruited lifers, gave them rotational assignments to support their development, groomed them years in advance to take on bigger and bigger roles, and tied their raises directly to each incremental move up the ladder. The bureaucracy was the point: Organizations wanted their talent practices to be rules-based and internally consistent so that they could reliably meet five-year (and sometimes 15-year) plans. That made sense. Every other aspect of companies, from core businesses to administrative functions, took the long view in their goal setting, budgeting, and operations. HR reflected and supported what they were doing.

By the 1990s, as business became less predictable and companies needed to acquire new skills fast, that traditional approach began to bend—but it didn’t quite break. Lateral hiring from the outside—to get more flexibility—replaced a good deal of the internal development and promotions. “Broadband” compensation gave managers greater latitude to reward people for growth and achievement within roles. For the most part, though, the old model persisted. Like other functions, HR was still built around the long term. Workforce and succession planning carried on, even though changes in the economy and in the business often rendered those plans irrelevant. Annual appraisals continued, despite almost universal dissatisfaction with them.

Now we’re seeing a more sweeping transformation. Why is this the moment for it? Because rapid innovation has become a strategic imperative for most companies, not just a subset. To get it, businesses have looked to Silicon Valley and to software companies in particular, emulating their agile practices for managing projects. So top-down planning models are giving way to nimbler, user-driven methods that are better suited for adapting in the near term, such as rapid prototyping, iterative feedback, team-based decisions, and task-centered “sprints.” As BMO’s chief transformation officer, Lynn Roger, puts it, “Speed is the new business currency.”

With the business justification for the old HR systems gone and the agile playbook available to copy, people management is finally getting its long-awaited overhaul too. In this article we’ll illustrate some of the profound changes companies are making in their talent practices and describe the challenges they face in their transition to agile HR.

Where We’re Seeing the Biggest Changes

Because HR touches every aspect—and every employee—of an organization, its agile transformation may be even more extensive (and more difficult) than the changes in other functions. Companies are redesigning their talent practices in the following areas:

Performance appraisals.

When businesses adopted agile methods in their core operations, they dropped the charade of trying to plan a year or more in advance how projects would go and when they would end. So in many cases the first traditional HR practice to go was the annual performance review, along with employee goals that “cascaded” down from business and unit objectives each year. As individuals worked on shorter-term projects of various lengths, often run by different leaders and organized around teams, the notion that performance feedback would come once a year, from one boss, made little sense. They needed more of it, more often, from more people.

An early CEB survey suggested that people actually got less feedback and support when their employers dropped annual reviews. That's because many companies put nothing in their place: Managers felt no pressing need to adopt a new feedback model and shifted their attention to other priorities. Dropping appraisals without a plan to fill the void was, of course, a recipe for failure.

Since learning that hard lesson, many organizations have switched to frequent performance assessments, often conducted project by project. This change has spread to a number of industries, including retail (Gap), big pharma (Pfizer), insurance (Cigna), investing (OppenheimerFunds), consumer products (P&G), and accounting (all Big Four firms). It is most famous at GE, across the firm’s range of businesses, and at IBM. Overall, the focus is on delivering more-immediate feedback throughout the year so that teams can become nimbler, “course-correct” mistakes, improve performance, and learn through iteration—all key agile principles.

In user-centered fashion, managers and employees have had a hand in shaping, testing, and refining new processes. For instance, Johnson & Johnson offered its businesses the chance to participate in an experiment: They could try out a new continual-feedback process, using a customized app with which employees, peers, and bosses could exchange comments in real time.

The new process was an attempt to move away from J&J’s event-driven “five conversations” framework (which focused on goal setting, career discussion, a midyear performance review, a year-end appraisal, and a compensation review) and toward a model of ongoing dialogue. Those who tried it were asked to share how well everything worked, what the bugs were, and so on. The experiment lasted three months. At first only 20% of the managers in the pilot actively participated. The inertia from prior years of annual appraisals was hard to overcome. But then the company used training to show managers what good feedback could look like and designated “change champions” to model the desired behaviors on their teams. By the end of the three months, 46% of managers in the pilot group had joined in, exchanging 3,000 pieces of feedback.

Regeneron Pharmaceuticals, a fast-growing biotech company, is going even further with its appraisals overhaul. Michelle Weitzman-Garcia, Regeneron’s head of workforce development, argued that the performance of the scientists working on drug development, the product supply group, the field sales force, and the corporate functions should not be measured on the same cycle or in the same way. She observed that these employee groups needed varying feedback and that they even operated on different calendars.

So the company created four distinct appraisal processes, tailored to the various groups’ needs. The research scientists and postdocs, for example, crave metrics and are keen on assessing competencies, so they meet with managers twice a year for competency evaluations and milestones reviews. Customer-facing groups include feedback from clients and customers in their assessments. Although having to manage four separate processes adds complexity, they all reinforce the new norm of continual feedback. And Weitzman-Garcia says the benefits to the organization far outweigh the costs to HR.

Coaching.

The companies that most effectively adopt agile talent practices invest in sharpening managers’ coaching skills. Supervisors at Cigna go through “coach” training designed for busy managers: It’s broken into weekly 90-minute videos that can be viewed as people have time. The supervisors also engage in learning sessions, which, like “learning sprints” in agile project management, are brief and spread out to allow individuals to reflect and test-drive new skills on the job. Peer-to-peer feedback is incorporated in Cigna’s manager training too: Colleagues form learning cohorts to share ideas and tactics. They’re having the kinds of conversations companies want supervisors to have with their direct reports, but they feel freer to share mistakes with one another, without the fear of “evaluation” hanging over their heads.

DigitalOcean, a New York–based start-up focused on software as a service (SaaS) infrastructure, engages a full-time professional coach on-site to help all managers give better feedback to employees and, more broadly, to develop internal coaching capabilities. The idea is that once one experiences good coaching, one becomes a better coach. Not everyone is expected to become a great coach—those in the company who prefer coding to coaching can advance along a technical career track—but coaching skills are considered central to a managerial career.

P&G, too, is intent on making managers better coaches. That’s part of a larger effort to rebuild training and development for supervisors and enhance their role in the organization. By simplifying the performance review process, separating evaluation from development discussions, and eliminating talent calibration sessions (the arbitrary horse trading between supervisors that often comes with a subjective and politicized ranking model), P&G has freed up a lot of time to devote to employees’ growth. But getting supervisors to move from judging employees to coaching them in their day-to-day work has been a challenge in P&G’s tradition-rich culture. So the company has invested heavily in training supervisors on topics such as how to establish employees’ priorities and goals, how to provide feedback about contributions, and how to align employees’ career aspirations with business needs and learning and development plans. The bet is that building employees’ capabilities and relationships with supervisors will increase engagement and therefore help the company innovate and move faster. Even though the jury is still out on the companywide culture shift, P&G is already reporting improvements in these areas, at all levels of management.

Teams.

Traditional HR focused on individuals—their goals, their performance, their needs. But now that so many companies are organizing their work project by project, their management and talent systems are becoming more team focused. Groups are creating, executing, and revising their goals and tasks with scrums—at the team level, in the moment, to adapt quickly to new information as it comes in. (“Scrum” may be the best-known term in the agile lexicon. It comes from rugby, where players pack tightly together to restart play.) They are also taking it upon themselves to track their own progress, identify obstacles, assess their leadership, and generate insights about how to improve performance.

In that context, organizations must learn to contend with:

Multidirectional feedback. Peer feedback is essential to course corrections and employee development in an agile environment, because team members know better than anyone else what each person is contributing. It’s rarely a formal process, and comments are generally directed to the employee, not the supervisor. That keeps input constructive and prevents the undermining of colleagues that sometimes occurs in hypercompetitive workplaces.

But some executives believe that peer feedback should have an impact on performance evaluations. Diane Gherson, IBM’s head of HR, explains that “the relationships between managers and employees change in the context of a network [the collection of projects across which employees work].” Because an agile environment makes it practically impossible to “monitor” performance in the old sense, managers at IBM solicit input from others to help them identify and address issues early on. Unless it’s sensitive, that input is shared in the team’s daily stand-up meetings and captured in an app. Employees may choose whether to include managers and others in their comments to peers. The risk of cutthroat behavior is mitigated by the fact that peer comments to the supervisor also go to the team. Anyone trying to undercut colleagues will be exposed.

In agile organizations, “upward” feedback from employees to team leaders and supervisors is highly valued too. The Mitre Corporation’s not-for-profit research centers have taken steps to encourage it, but they’re finding that this requires concentrated effort. They started with periodic confidential employee surveys and focus groups to discover which issues people wanted to discuss with their managers. HR then distilled that data for supervisors to inform their conversations with direct reports. However, employees were initially hesitant to provide upward feedback—even though it was anonymous and was used for development purposes only—because they weren’t accustomed to voicing their thoughts about what management was doing.

Mitre also learned that the most critical factor in getting subordinates to be candid was having managers explicitly say that they wanted and appreciated comments. Otherwise people might worry, reasonably, that their leaders weren’t really open to feedback and ready to apply it. As with any employee survey, soliciting upward feedback and not acting on it has a diminishing effect on participation; it erodes the hard-earned trust between employees and their managers. When Mitre’s new performance-management and feedback process began, the CEO acknowledged that the research centers would need to iterate and make improvements. A revised system for upward feedback will roll out this year.

Because feedback flows in all directions on teams, many companies use technology to manage the sheer volume of it. Apps allow supervisors, coworkers, and clients to give one another immediate feedback from wherever they are. Crucially, supervisors can download all the comments later on, when it’s time to do evaluations. In some apps, employees and supervisors can score progress on goals; at least one helps managers analyze conversations on project management platforms like Slack to provide feedback on collaboration. Cisco uses proprietary technology to collect weekly raw data, or “breadcrumbs,” from employees about their peers’ performance. Such tools enable managers to see fluctuations in individual performance over time, even within teams. The apps don’t provide an official record of performance, of course, and employees may want to discuss problems face-to-face to avoid having them recorded in a file that can be downloaded. We know that companies recognize and reward improvement as well as actual performance, however, so hiding problems may not always pay off for employees.

Frontline decision rights. The fundamental shift toward teams has also affected decision rights: Organizations are pushing them down to the front lines, equipping and empowering employees to operate more independently. But that’s a huge behavioral change, and people need support to pull it off. Let’s return to the Bank of Montreal example to illustrate how it can work. When BMO introduced agile teams to design some new customer services, senior leaders weren’t quite ready to give up control, and the people under them were not used to taking it. So the bank embedded agile coaches in business teams. They began by putting everyone, including high-level executives, through “retrospectives”—regular reflection and feedback sessions held after each iteration. These are the agile version of after-action reviews; their purpose is to keep improving processes. Because the retrospectives quickly identified concrete successes, failures, and root causes, senior leaders at BMO immediately recognized their value, which helped them get on board with agile generally and loosen their grip on decision making.

Complex team dynamics. Finally, since the supervisor’s role has moved away from just managing individuals and toward the much more complicated task of promoting productive, healthy team dynamics, people often need help with that, too. Cisco’s special Team Intelligence unit provides that kind of support. It’s charged with identifying the company’s best-performing teams, analyzing how they operate, and helping other teams learn how to become more like them. It uses an enterprise-wide platform called Team Space, which tracks data on team projects, needs, and achievements to both measure and improve what teams are doing within units and across the company.

Compensation.

Pay is changing as well. A simple adaptation to agile work, seen in retail companies such as Macy’s, is to use spot bonuses to recognize contributions when they happen rather than rely solely on end-of-year salary increases. Research and practice have shown that compensation works best as a motivator when it comes as soon as possible after the desired behavior. Instant rewards reinforce instant feedback in a powerful way. Annual merit-based raises are less effective, because too much time goes by.

Patagonia has actually eliminated annual raises for its knowledge workers. Instead the company adjusts wages for each job much more frequently, according to research on where market rates are going. Increases can also be allocated when employees take on more-difficult projects or go above and beyond in other ways. The company retains a budget for the top 1% of individual contributors, and supervisors can make a case for any contribution that merits that designation, including contributions to teams.

Upward feedback from employees to team leaders is valued in agile organizations.

Compensation is also being used to reinforce agile values such as learning and knowledge sharing. In the start-up world, for instance, the online clothing-rental company Rent the Runway dropped separate bonuses, rolling the money into base pay. CEO Jennifer Hyman reports that the bonus program was getting in the way of honest peer feedback. Employees weren’t sharing constructive criticism, knowing it could have negative financial consequences for their colleagues. The new system prevents that problem by “untangling the two,” Hyman says.

DigitalOcean redesigned its rewards to promote equitable treatment of employees and a culture of collaboration. Salary adjustments now happen twice a year to respond to changes in the outside labor market and in jobs and performance. More important, DigitalOcean has closed gaps in pay for equivalent work. It’s deliberately heading off internal rivalry, painfully aware of the problems in hypercompetitive cultures (think Microsoft and Amazon). To personalize compensation, the firm maps where people are having impact in their roles and where they need to grow and develop. The data on individuals’ impact on the business is a key factor in discussions about pay. Negotiating to raise your own salary is fiercely discouraged. And only the top 1% of achievement is rewarded financially; otherwise, there is no merit-pay process. All employees are eligible for bonuses, which are based on company performance rather than individual contributions. To further support collaboration, DigitalOcean is diversifying its portfolio of rewards to include nonfinancial, meaningful gifts, such as a Kindle loaded with the CEO’s “best books” picks.

How does DigitalOcean motivate people to perform their best without inflated financial rewards? Matt Hoffman, its vice president of people, says it focuses on creating a culture that inspires purpose and creativity. So far that seems to be working. The latest engagement survey, via Culture Amp, ranks DigitalOcean 17 points above the industry benchmark in satisfaction with compensation.

Recruiting.

With the improvements in the economy since the Great Recession, recruiting and hiring have become more urgent—and more agile. To scale up quickly in 2015, GE’s new digital division pioneered some interesting recruiting experiments. For instance, a cross-functional team works together on all hiring requisitions. A “head count manager” represents the interests of internal stakeholders who want their positions filled quickly and appropriately. Hiring managers rotate on and off the team, depending on whether they’re currently hiring, and a scrum master oversees the process.

To keep things moving, the team focuses on vacancies that have cleared all the hurdles—no reqs get started if debate is still ongoing about the desired attributes of candidates. Openings are ranked, and the team concentrates on the top-priority hires until they are completed. It works on several hires at once so that members can share information about candidates who may fit better in other roles. The team keeps track of its cycle time for filling positions and monitors all open requisitions on a kanban board to identify bottlenecks and blocked processes. IBM now takes a similar approach to recruitment.

Companies are also relying more heavily on technology to find and track candidates who are well suited to an agile work environment. GE, IBM, and Cisco are working with the vendor Ascendify to create software that does just this. The IT recruiting company HackerRank offers an online tool for the same purpose.

Learning and development.

Like hiring, L&D had to change to bring new skills into organizations more quickly. Most companies already have a suite of online learning modules that employees can access on demand. Although helpful for those who have clearly defined needs, this is a bit like giving a student the key to a library and telling her to figure out what she must know and then learn it. Newer approaches use data analysis to identify the skills required for particular jobs and for advancement and then suggest to individual employees what kinds of training and future jobs make sense for them, given their experience and interests.

IBM uses artificial intelligence to generate such advice, starting with employees’ profiles, which include prior and current roles, expected career trajectory, and training programs completed. The company has also created special training for agile environments—using, for example, animated simulations built around a series of “personas” to illustrate useful behaviors, such as offering constructive criticism.

Traditionally, L&D has included succession planning—the epitome of top-down, long-range thinking, whereby individuals are picked years in advance to take on the most crucial leadership roles, usually in the hope that they will develop certain capabilities on schedule. The world often fails to cooperate with those plans, though. Companies routinely find that by the time senior leadership positions open up, their needs have changed. The most common solution is to ignore the plan and start a search from scratch. But organizations often continue doing long-term succession planning anyway. (About half of large companies have a plan to develop successors for the top job.) Pepsi is one company taking a simple step away from this model by shortening the time frame. It provides brief quarterly updates on the development of possible successors—in contrast to the usual annual updates—and delays appointments so that they happen closer to when successors are likely to step into their roles.

Ongoing Challenges

To be sure, not every organization or group is in hot pursuit of rapid innovation. Some jobs must remain largely rules based. (Consider the work that accountants, nuclear control-room operators, and surgeons do.) In such cases agile talent practices may not make sense.

And even when they’re appropriate, they may meet resistance—especially within HR. A lot of processes have to change for an organization to move away from a planning-based, “waterfall” model (which is linear rather than flexible and adaptive), and some of them are hardwired into information systems, job titles, and so forth. The move toward cloud-based IT, which is happening independently, has made it easier to adopt app-based tools. But people issues remain a sticking point. Many HR tasks, such as traditional approaches to recruitment, onboarding, and program coordination, will become obsolete, as will expertise in those areas.

Meanwhile, new tasks are being created. Helping supervisors replace judging with coaching is a big challenge not just in terms of skills but also because it undercuts their status and formal authority. Shifting the focus of management from individuals to teams may be even more difficult, because team dynamics can be a black box to those who are still struggling to understand how to coach individuals. The big question is whether companies can help managers take all this on and see the value in it.

The HR function will also require reskilling. It will need more expertise in IT support—especially given all the performance data generated by the new apps—and deeper knowledge about teams and hands-on supervision. HR has not had to change in recent decades nearly as much as have the line operations it supports. But now the pressure is on, and it’s coming from the operating level, which makes it much harder to cling to old talent practices.

DOST: Advancing science, technology agenda best option for PHL growth – Business Mirror

Last updated on February 21st, 2018 at 09:10 pm

IN a recent meeting with members of the Makati Business Club and several foreign chambers of commerce, the government’s chief scientist Fortunato dela Peña encouraged local and foreign businessmen to invest in technology-related enterprises. Dela Peña, Secretary of the Department of Science and Technology (DOST), said this is relevant as the government is now investing heavily in science and technology.

The chief scientist cited advancements in health and medicine, with the country’s numerous traditional medicinal herbs as a focus, as well as in education, energy, disaster resiliency and climate-change adaptation, including enterprises that deal with creativity, such as design.

A civil servant of two decades who held various teaching and civil-service positions up to his appointment as the DOST’s top official under the present administration, Dela Peña said that, for a time, the country seemed to have grown “resistant” to science- and technology-related endeavors, although a core number of advocates persisted in pushing the science agenda.

The progressive minds seemed to have prevailed, the DOST official said.

The Philippines recently ranked 73rd out of 128 economies in a Science, Technology and Innovation (STI) index; the report cited the country’s strength in research and the commercialization of STI ideas. It also said that 60 percent of companies in the country offer training to improve their employees’ technical skills.

Investment

HOWEVER, a study by the Philippine Institute for Development Studies highlighted the weak ties between innovation-driven firms and the government, and it also identified the country’s low expenditure in research and development (R&D).

According to Dela Peña, this is why the government is now extending all its efforts to reach out to the private sector, explaining that STI plays an important role in economic and social progress and is a key driver of an economy’s long-term growth.

Technology adoption, the official said, allows a country’s firms and citizens to benefit from innovations created in other countries, and allows it to catch up with and even leapfrog obsolete technologies.

This can lead to significant improvements in productivity for existing firms in agriculture, industry and services.

For one, long-term investments in building local capacity for technology generation can lead to innovations that will give local firms a competitive advantage. This can result in the creation of new firms and even entirely new industries.

For another, the local medicine sector has been showing potential, the DOST official said, citing the case of two dozen local herbs or medicinal plants being studied as one example.

Lagundi

WHEN asked about the case of Lagundi (Vitex negundo), whose efficacy as medicine is being challenged by drug manufacturers, DOST Assistant Secretary for International Cooperation Leah J. Buendia said the shrub was subjected to 20 years of stringent clinical trials and has consistently been proven effective.

“But since the DOST, and the government for that matter, is not into commercialization, private companies are the ones who manufacture the component of the medicinal plant into commercially available medicines,” Buendia said, adding that the agency only provides the results, which include the right formula and volume of the medicinal component.

Asked about the possibility that private manufacturers might knowingly dilute the required strength of the medicine to cut costs, unwittingly rendering the commercially available product ineffective, the official declined to comment.

She assured, however, that the private sector is working with the agency to commercialize discoveries made and studied by the DOST in partnership with the private sector, as these “products would not help private companies profit but advance the country’s agenda.”

That science agenda, despite advances, still needs prioritization and more funding. It is contained in the Philippine Development Plan 2017-2022, which devotes an entire chapter to STI.

STI culture

RECENT positive developments and advancements in science and technology notwithstanding, there remains a low level of innovation in the country. This stems from weaknesses in STI human capital, low R&D expenditure and weak linkages in the STI ecosystem.

In the Global Innovation Index (GII) Report last year, the Philippines ranked 74th among 128 economies in terms of overall innovation, garnering a score of 31.8 out of 100. This is a slight improvement from the score of 31.1, ranking 83rd out of 141 economies in 2015.

The country also ranked fifth out of seven members of the Association of Southeast Asian Nations (Asean) that were included in the survey. The Philippines was ahead of Cambodia (95th) and Indonesia (88th) but lagged behind Singapore (6th), Malaysia (35th), Thailand (52nd) and Vietnam (59th).

The factors behind the weak performance of the STI sector include a weak STI culture, Dela Peña said.

There is a lack of public awareness of and interest in STI, and many sectors do not recognize, appreciate or understand the use of technology- and science-based information in their daily activities.

There are also a number of weaknesses in social and professional culture, such as the research culture in universities and the commercialization of results from public research. According to Dela Peña, a lack of awareness of intellectual property rights, in both the research community and the general public, still persists.

Despite their availability, the adoption and application of technologies among micro, small and medium enterprises (MSMEs) and in sectors like agriculture and fisheries remain low, he added.

Research

LOW government and private spending on STI is another factor behind the weak performance of the STI sector, according to the GII report.

Investments in R&D are central for enhancing the country’s innovation ecosystem, the report said. Expenditures on R&D and innovation activities, as well as the support given to the development of human resources in various fields of science and technology (S&T), are the parameters scrutinized in the monitoring and evaluation of STI.

While nominal R&D expenditures increased by 80 percent to P15.92 billion in 2013, the proportion of R&D spending to GDP stood at only 0.14 percent. This is substantially below the 1-percent benchmark recommended by the United Nations Educational, Scientific and Cultural Organization (Unesco) and the global average of 2.04 percent. It is also low compared with other Asean countries, such as Vietnam (0.19 percent), Thailand (0.36 percent), Malaysia (1.09 percent) and Singapore (2.0 percent). The data is available online from Unesco’s Institute for Statistics.
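As a rough sanity check on these figures (not a calculation from the article), the nominal GDP they imply can be back-calculated; since the 0.14 percent is rounded, the result is only approximate:

```python
rd_spending = 15.92e9   # P15.92 billion nominal R&D expenditure in 2013
rd_to_gdp = 0.0014      # 0.14 percent of GDP, as reported

# GDP implied by the two figures: roughly P11.4 trillion
implied_gdp = rd_spending / rd_to_gdp

# Shortfall against Unesco's 1-percent benchmark at that GDP level
benchmark_spending = implied_gdp * 0.01  # roughly P114 billion
shortfall = benchmark_spending - rd_spending
```

In other words, meeting the Unesco benchmark would have required roughly seven times the 2013 outlay.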

The country’s relatively low ranking in the GII Report was pulled down by weaknesses in human capital and R&D, with a score of 22.7 out of 100, ranking 95th. This is due to the low public and private expenditures on education and R&D, as well as low tertiary inbound mobility. Tertiary inbound mobility refers to the number of students from abroad studying in a given country, as a percentage of the total tertiary or college enrollment.

The bulk of the R&D spending, about 60 percent, comes from the public sector. These were directed to agricultural and industrial production and technology, protection and improvement of human health, control and care of the environment, among others. Most of the R&D activities in the country are still concentrated in the National Capital Region, Calabarzon and Central Luzon.

Manpower

ANOTHER indicator measuring the capacity for technology generation is the number of S&T human resources engaged in R&D.

As of 2013, the country had a total of 36,517 R&D personnel, of whom 26,495 were key researchers (scientific, technological and engineering personnel engaged in R&D); the rest were technicians and support personnel.

The figures denote that there are only 270 researchers for every one million Filipinos. That ratio falls short of the Unesco norm of 380 per million population and the average of 1,020 researchers per million across the developing economies of East Asia and the Pacific.
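The per-million ratio follows directly from the researcher headcount, assuming a population of about 98 million (a figure implied by the article's numbers, not stated in it):

```python
researchers = 26_495        # researchers engaged in R&D as of 2013
population = 98_000_000     # assumed Philippine population (implied by the 270 ratio)

per_million = researchers / (population / 1_000_000)  # about 270

# Gap against the Unesco norm of 380 researchers per million
unesco_norm = 380
gap = unesco_norm - round(per_million)
```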

Of the total researchers in the country from the government, higher educational institutions (HEIs) and private nonprofit sectors, 14 percent had doctoral degrees (PhD), 38 percent had master’s degrees, while 34 percent had Bachelor of Science (BS) up to post-BS degrees. The low number of researchers in the country reflects the propensity of the educational system in the country to produce graduates outside of science, technology, engineering and mathematics, or Stem, programs—the disciplines where R&D flourishes. Nevertheless, the latest GII report indicates that in terms of graduates in science and engineering, the country garnered a score of 25.5 out of 100, ranking 26th.

Capital

AN assessment of the country’s innovation system conducted by a program of the United States Agency for International Development (Usaid) reveals that the supply of Stem graduates exceeds local demand.

As a result, there is an out-migration and, worse, underemployment of many skilled, locally trained scientists and engineers. The report by the Usaid’s Science, Technology, Research and Innovation for Development, or Stride, program also cited a shortage in training for fields critical for innovation, particularly in information technology. Such shortage contributes to the challenge that many local companies face, especially in securing employees with the skills required to grow the business.

This partly explains the country’s brain drain. It is not so much that Filipinos are not “nationalistic” but simply that there is limited opportunity for people of science to stay in the country.

However, Buendia said the issue of nationalism has some credence, if not the absolute answer, citing the case of South Korea in the 1950s.

When South Korea was at its lowest point economically, the government called on all its scientists and engineers scattered around the world to go home and help build the economy, and many responded, she said.

According to Buendia, this brain drain contributes to the problem as potential researchers, scientists and engineers, the key actors for the innovation ecosystem to flourish, prefer to seek employment overseas due to better economic opportunities and potential for advancement. Since knowledge and technology are mostly embodied in human resources, this emphasizes the urgency to accelerate the development of R&D human resource.

Patents

THE output of R&D is commonly measured in terms of patents applied for and granted to Filipino residents.

However, reports show that many universities do not have the expertise to market their patent portfolios for commercial use. Furthermore, technology generators face persisting issues on technology ownership while researchers are constrained by the “publish or perish” phenomenon.

This results in the weak technology transfer system in the country.

An annual average of 209 patent utility models and 597 industrial design applications were filed from 2005 to 2015. In the same period, an annual average of 54 patents, 446 utility models and 502 industrial designs were granted.

In 2016, the World Economic Forum (WEF) ranked the Philippines 86th out of 128 economies for the number of patents filed under the Patent Cooperation Treaty per million population. Invention patents granted to local inventors represent the smallest share in number of intellectual properties granted from 2001 to 2003. Industrial design and utility models consistently comprise the majority of the intellectual property granted.

The country also needs to catch up in research publications, since the number of scientific publications in peer-reviewed journals per million population stands at 55, substantially below those of Asean member-states like Singapore, with its staggering 10,368, Malaysia with 1,484, Thailand with 478 and Vietnam with 105.

Ecosystem

ANOTHER factor behind the weak performance of the STI sector is the weak linkages among players in the STI ecosystem.

The 2009 Survey of Innovation Activities and the 2014 Usaid-Stride Assessment of the Philippine Innovation Ecosystem found that innovation actors have weak cooperation, partnerships and trust among themselves. Most HEIs perceive collaboration with companies as outside their core missions and a risk of exploitation.

Consequently, firms report that difficulties in convincing HEIs of their shared interests stem from resentment, suspicion and distrust. In effect, firms end up with little technical assistance from the government and research institutions.

Another factor in this equation is restrictive regulations that hamper implementation of R&D programs and projects.

The tedious government procurement process hobbles the immediate procurement of equipment and other needed materials for research, which, in turn, delays the implementation of R&D projects, the GII report said. This was confirmed by the Usaid-Stride study, which revealed that restrictive regulations make the procurement of equipment and consumables for research extremely slow and unnecessarily complex, decreasing research productivity, publication potential, and speed-to-market of innovation.

In addition, the report said government research grants do not compensate universities for faculty members’ salaries during research activities, a practice rarely seen outside the Philippines.

The final factor in the weak performance of the STI sector is the inadequacy of STI infrastructure, which includes laboratory facilities, testing facilities and R&D centers.

Many existing hubs need upgrading to improve their services, which contributes to the lack of absorptive capacity in research institutions, the Usaid-Stride report said. It also noted that public institutions fail to provide young researchers with equipment packages, particularly those returning from PhD studies abroad with more advanced research agendas.

The country’s leading research institutions also remain concentrated in Luzon.

Hopes

DESPITE the many inadequacies, from funding to human capital, there are some technology-intensive research and capacity-building projects that have resulted in products now being used successfully.

One is the micro-satellite.

In April 2016, the country launched into space its first micro-satellite called Diwata-1. It was designed, developed and assembled by Filipino researchers and engineers under the guidance of Japanese experts. The Diwata (deity in English) satellite provides real-time, high-resolution and multi-color infrared images for various applications, including meteorological imaging, crop and ocean productivity measurement and high-resolution imaging of natural and man-made features.

It enables a more precise estimate of the country’s agricultural production, provides images of watersheds and floodplains for a better understanding of water available for irrigation, power and domestic consumption. The satellite also provides accurate information on any disturbance and degradation of forest and upland areas.

The country also has the Nationwide Operational Assessment of Hazards (Noah), initiated in June 2012 to help manage risks associated with natural hazards and disasters. The project developed hydromet sensors and high-resolution geo-hazard maps, the latter generated by lidar (light detection and ranging) technology for flood modeling.

Noah helps the government provide timely warnings, with a lead time of at least six hours, ahead of impending floods.

According to Buendia, the country is now training Cambodians in this technology as part of the partnerships among Asean countries, just as Japan assisted the country’s scientists and engineers in building its first micro-satellite.

Another hope lies in the so-called Intelligent Operation Center Platform.

Established through a collaboration between the local government of Davao City and IBM Philippines Inc., the center resulted in the creation of a dashboard that allows authorized government agencies, such as police, fire and anti-terrorism task force, to use analytics software for monitoring events and operations in real time.

Initiatives

THE DOST, in cooperation with HEIs and research institutions, established advanced facilities that seek to spur R&D activities and provide MSMEs access to testing services needed to increase their productivity and competitive advantage.

One is the Advanced Device and Materials Testing Laboratories. The center houses advanced equipment for failure analysis and materials characterization to address advanced analytical needs for quality control, materials identification and R&D. Closely related to this facility is the Electronics Products Development Center, used to design, develop and test hardware and software for electronic products.

There are also high-performance computing facilities that perform tests and run computationally intensive applications for numerical weather prediction, climate modeling, as well as analytics and data modeling and archiving.

The Philippines could also boast of its Genome Center, a core facility that combines basic and applied research for the development of health diagnostics, therapeutics, DNA forensics and preventive products, and improved crop varieties.

According to Buendia, the country also has drug-discovery facilities, which address the requirements for producing high-quality and globally acceptable drug candidates. She said the Philippines also has nanotechnology centers, which provide technical services and an enabling environment for interdisciplinary and collaborative R&D in various nanotechnology applications.

Buendia said there are also radiation processing facilities that are used to degrade, graft, or crosslink polymers, monomers, or chemical compounds for industrial, agricultural, environmental and medical applications. The Philippines could also boast of its Die and Mold Solutions Center, which enhances the competitiveness of the local tool and die sector through the localization of currently imported dies and molds.

These reflect that the country is advancing, albeit slowly, toward a culture that embraces STI as a sure path to growth, according to Dela Peña.


Alladin S. Diega

Alladin S. Diega has been working with BusinessMirror as a correspondent since 2013. Mr. Diega currently covers the Metro Page. He studied BS Journalism at the Lyceum of the Philippines and has also worked with various nongovernment organizations. He breeds African Lovebirds as a hobby.

30 Incredibly Useful Things You Didn’t Know Slack Could Do – Fast Company

4. Save your place in a channel or direct message by holding down the Alt or Option key and then clicking any timestamp (or long-pressing on a timestamp from the mobile app and selecting “Mark unread”). The next time you open that thread on any device, you’ll be taken directly to the point you set.

5. Get where you need to go faster by letting Slack’s Quick Switcher beam you directly to any channel, direct message, or team—no clicking or navigation required. Just hit Ctrl-K or Cmd-K and type the first letter of your desired destination. You’ll see a list of options appear and can select the one you want or type additional letters to narrow it down further. (Bonus tip: If you’re using one of the Slack desktop apps, Ctrl-T or Cmd-T will do the same thing. Take your pick!)

6. Make it easy to find important messages or files by pinning them to a channel or private message. Click the three-dot menu icon above a message (or long-press it from the mobile app) and select “Pin to this conversation.” The item will then appear within a pushpin icon at the top of the thread, where everyone will see and be able to access it as needed.

Put a pin in something important so everyone can find it with ease.

7. Catch up on activity quickly by putting Slack’s All Unreads feature to use. First, make sure the feature is activated within the Sidebar section of the desktop app’s preferences. Then just look for the “All Unreads” line in the sidebar, and start there to see a single centralized list of everything you haven’t read.

8. When a channel gets too cluttered for you to focus, let Slack hide all the inline image previews—including, yes, any animated GIFs your Giphy-loving colleagues have summoned. Type “/collapse” to send ’em all a-packin’, then type “/expand” if and when you want everything back.

Get Your Message Across

9. You may know about Slack’s basic text-formatting commands (asterisks around text for bold, underscores for italics), but the app also has advanced options that’ll let you further transform the appearance of your words. To wit:

The Case Against Google – The New York Times

Standard Oil’s technological discoveries gave the company huge advantages over its rivals, and Rockefeller exploited those advantages ruthlessly. He cut secret deals with railroads so that other firms had to pay more for transportation. He forced smaller refineries to choose between selling out to him or facing bankruptcy. “Rockefeller and his associates did not build the Standard Oil Co. in the boardrooms of Wall Street,” wrote Ida Tarbell, a muckraking journalist of the day. “They fought their way to control by rebate and drawback, bribe and blackmail, espionage and price cutting, and perhaps more important, by ruthless, never slothful efficiency of organization.”

In 1906, President Theodore Roosevelt ordered his Justice Department to sue Standard Oil for antitrust violations. But government lawyers faced a quandary: It wasn’t illegal for Standard Oil to be a monopoly. It wasn’t even illegal to compete mercilessly. So government prosecutors found a new argument: If a firm is more powerful than everyone else, they said, it can’t simply act like everyone else. Instead, it has to live by a special set of rules, so that other companies get a fair shot. “The theory was that competition is good, and if a monopoly extinguishes competition, that’s bad,” says Herbert Hovenkamp, co-author of a seminal treatise on antitrust law. “Once you become a monopoly, you have to start acting differently, and if you don’t, then what you’ve been doing all along starts breaking the law.”

The Supreme Court agreed and split Standard Oil into 34 firms. (Rockefeller received stock in all of them and became even wealthier.) In the decades following the Standard Oil breakup, antitrust enforcement generally abided by a core principle: When a company grows so powerful that it becomes a gatekeeper, and uses that might to undermine competitors, then the government should intervene. And in the last century, as courts have censured other monopolies, academics and jurists have noticed a pattern: Monopolies and technology often seem intertwined. When a company discovers a technological advantage — like the innovations of Rockefeller’s scientists — it sometimes makes that firm so powerful that it becomes a monopoly almost without trying very hard. Many of the most important antitrust lawsuits in American history — against IBM, Alcoa, Kodak and others — were rooted in claims that one company had made technological discoveries that allowed it to outpace competitors.

For decades, there seemed to be a consensus among policymakers and business leaders (though not always among targeted companies) about how the antitrust laws should be enforced. But around the turn of this century, a number of tech companies emerged that caused some people to question whether the antitrust formula made sense anymore. Firms like Google and Facebook have become increasingly useful as they have grown bigger and bigger — a characteristic known as network effects. What’s more, some have argued that the online world is so fast-moving that no antitrust lawsuit can keep pace. Nowadays even the biggest titan can be defeated by a tiny start-up, as long as the newcomer has better ideas or faster tech. Antitrust laws, digital executives said, aren’t needed anymore.

Consider Microsoft. The government spent most of the 1990s suing Microsoft for antitrust violations, a prosecution that many now view as a complete waste of time and money. When Microsoft’s chief executive, Bill Gates, signed a consent decree to resolve one of its monopoly investigations in 1994, he told a reporter that it was essentially pointless for the company’s various divisions: “None of the people who run those divisions are going to change what they do or think.” Even after a federal judge ordered Microsoft broken into separate companies in 2000, the punishment didn’t take. Microsoft fought the ruling and won on appeal. The government then offered a settlement so feeble that nine states begged the court to reject the proposal. It was approved.

What eventually humbled Bill Gates and ended Microsoft’s monopoly wasn’t antitrust prosecutions, observers say, but a more nimble start-up named Google, a search engine designed by two Stanford Ph.D. dropouts that outperformed Microsoft’s own forays into search (first MSN Search and now Bing). Then those two dropouts introduced a series of applications, like Google Docs and Google Sheets, that eventually began to compete with almost every aspect of Microsoft’s businesses. And Google did all that not by relying on government prosecutors but by being smarter. You don’t need antitrust in the digital marketplace, critics argue. “When our products don’t work or we make mistakes, it’s easy for users to go elsewhere because our competition is only a click away,” Google’s co-founder, Larry Page, said in 2012. Translation: The government ought to stop worrying, because no online giant will ever survive any longer than it deserves to.

Once Foundem.com was available to everyone, the company’s honeymoon lasted precisely two days. During its first 48 hours, the Raffs saw a rush of traffic from users typing product queries into Google and other search engines. But then, suddenly, the traffic stopped. Alarmed, Adam and Shivaun began running diagnostics. They quickly discovered that their site, which until then had been appearing near the top of search results, was now languishing on Google, mired 12 or 15 or 64 or 170 pages down. On other search engines, like MSN Search and Yahoo, Foundem still ranked high. But on Google, Foundem had effectively disappeared. And Google, of course, was where a vast majority of people searched online.

As Amazon competition heats up, D.C. mayor heads west to talk tech – The Washington Post


Innovator Awards Program 2018: Semifinalists | Healthcare Informatics Magazine – Healthcare Informatics

Editor’s Note: We at Healthcare Informatics were once again ecstatic about the exceptional quality of the submissions we received from innovating patient care organizations across the U.S. In addition to the four winning teams this year (whose stories will be posted throughout this week), our editorial team also selected several runners-up. Below, please find descriptions of the initiatives of the 14 teams that we have awarded semifinalist status in this year’s program.

Centre for Addiction and Mental Health (Ontario, Canada)

Improving patient care through achievement of HIMSS EMRAM Stage 7

In May 2014, CAMH implemented a clinical information system using a big-bang approach with an integrated team of clinicians, information technology specialists, and other staff. But after implementation, CAMH noted a lack of clinical practice standardization. A new initiative emerged that included work to refine how clinical documentation is entered into the EHR, the development of electronic whiteboards to display and manage assessments and risk factors, the leveraging of data to inform improvement initiatives, and many other requirements defined by the HIMSS EMRAM (EMR Adoption Model) Stage 7 criteria.

CAMH is the first academic teaching hospital in Canada to achieve HIMSS Stage 7; this achievement is a milestone in both the Canadian and international health landscape. Now, more than 99 percent of CAMH’s clinically relevant documentation is completed directly within the EHR, and CPOE (computerized provider order entry) rates have been over 90 percent since December 2016. What’s more, the creation of a suicide risk dashboard has led to 90 percent of patients having a suicide risk assessment completed within 24 hours of admission.

Cleveland Clinic (Cleveland, Ohio)

An enterprise imaging service

The goal of the enterprise imaging service is to provide a comprehensive longitudinal medical record by incorporating all medical images into a single archive. Through a universal viewer, the archive is integrated with the EHR and provides a foundation for image distribution to all caregivers throughout the enterprise. The archive also serves as a foundation for image sharing. Implementation required a comprehensive assessment of all image-generating equipment throughout all hospitals and outpatient centers.

The Clinic’s officials say that the establishment of an enterprise imaging program has led to the consolidation of imaging archives throughout the health system. Images that were not previously easily accessible are now readily viewable through the EHR (electronic health record), with access points both within the firewall and from home. To date, 11 different service lines and more than 440 pieces of image-generating equipment outside of radiology have been integrated.

Compass Medical (Massachusetts)

Annual wellness, chronic care management and quality outcomes

By leveraging new information technology, Compass Medical has been able to follow proven population health management and care management principles, allowing patient care leaders to identify and target specific population groups, stratify and prioritize care gaps, and engage patients in individualized care plan activities. In 2016, for example, Compass Medical identified and targeted more than 14,000 Medicare patients who were struggling to manage their chronic health conditions and needed a more personalized and comprehensive care plan. One year later, Compass Medical developed and launched a new Chronic Care Management Program to engage with and closely manage Medicare patients who suffer from two or more chronic health conditions. With the help of its EHR and big data platform, Compass Medical positioned itself to automate many of the workflows for care management nurses.
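The patient-identification step described above can be illustrated with a minimal sketch. The records, field names and helper below are hypothetical (this is not Compass Medical’s data model), but the eligibility rule reflects the CMS threshold of two or more chronic conditions for Chronic Care Management:

```python
# Hypothetical patient records; field names are illustrative only,
# not Compass Medical's actual data model.
patients = [
    {"id": 1, "medicare": True,  "conditions": ["CHF", "diabetes", "COPD"]},
    {"id": 2, "medicare": True,  "conditions": ["hypertension"]},
    {"id": 3, "medicare": False, "conditions": ["CHF", "CKD"]},
    {"id": 4, "medicare": True,  "conditions": ["diabetes", "CKD"]},
]

def ccm_eligible(patients, min_conditions=2):
    """Identify Medicare patients with two or more chronic conditions,
    the CMS eligibility threshold for Chronic Care Management."""
    return [p["id"] for p in patients
            if p["medicare"] and len(p["conditions"]) >= min_conditions]

eligible = ccm_eligible(patients)   # patients 1 and 4 qualify
```

In practice this filtering would run against EHR problem lists and claims data rather than an in-memory list, but the stratification logic is the same.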

The Annual Wellness Visit (AWV) is another example of a preventive care service that has benefited from leveraging IT. In 2017, national trends suggested that AWV utilization was still hovering in the low 20-percent range, with the highest-performing state reaching 35 percent. Utilizing EHR-based patient engagement campaigns to increase focused outreach, incorporating a team-based care model with scribes, and creating standard work processes to reduce provider burden helped Compass Medical reach 57 percent AWV utilization for its Medicare-eligible population by the end of 2017.

Duke University School of Medicine (Durham, N.C.)

A NICU discrete event simulation model

Duke’s neonatal clinicians care for more than 800 babies each year in the Duke Neonatal Intensive Care Unit (NICU). Although the majority do well, about 40 babies do not survive. How could they improve outcomes and save lives? Duke’s neonatal research team partnered with analytics company SAS to create an analytics-based model of Duke Children’s Hospital’s Level IV neonatal intensive care unit. The result was a discrete event simulation model that closely reproduced the clinical outcomes of Duke’s training unit; validated against data held back from model development, the model also closely tracked actual unit outcomes.

The model uses a vast resource of clinical data to simulate the experience of patients, their conditions and staff responses in a computerized environment. It creates virtual babies experiencing care within a simulated NICU environment, including virtual beds staffed by virtual nurses. The research team attests that they cannot find any evidence of discrete event simulation modeling being used in a NICU setting, making this a first in neonatal care.
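The core mechanics of discrete event simulation, a time-ordered event queue driving arrivals, resource use and departures, can be sketched in a few lines. The toy below (bed counts, arrival and stay distributions are all invented; this is not Duke’s validated model) shows the pattern using Python’s stdlib `heapq` as the event queue:

```python
import heapq
import random

def simulate_nicu(num_beds=4, num_babies=20, seed=42):
    """Toy discrete event simulation of a NICU: babies arrive, occupy a
    bed for a random length of stay, then are discharged. Illustrative
    only; all parameters are invented."""
    random.seed(seed)
    events = []            # min-heap of (time, sequence, kind, baby_id)
    seq = 0
    t = 0.0
    # Schedule arrivals with exponential inter-arrival times (mean 6 h).
    for baby in range(num_babies):
        t += random.expovariate(1 / 6.0)
        heapq.heappush(events, (t, seq, "arrival", baby)); seq += 1

    free_beds = num_beds
    waiting = []                        # babies queued for a bed
    admitted = discharged = 0

    def admit(now, baby):
        nonlocal free_beds, admitted, seq
        free_beds -= 1
        admitted += 1
        stay = random.expovariate(1 / 72.0)   # mean 72 h length of stay
        heapq.heappush(events, (now + stay, seq, "discharge", baby)); seq += 1

    while events:
        now, _, kind, baby = heapq.heappop(events)
        if kind == "arrival":
            if free_beds > 0:
                admit(now, baby)
            else:
                waiting.append(baby)    # no bed free: queue the baby
        else:                           # a discharge frees a bed
            free_beds += 1
            discharged += 1
            if waiting:
                admit(now, waiting.pop(0))

    return {"admitted": admitted, "discharged": discharged}
```

A production model like Duke’s would replace the invented distributions with ones fitted to years of clinical data and track condition-specific outcomes rather than simple bed occupancy, but the event-queue skeleton is the same.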

Houston Methodist (Houston, Texas)

A coordinated care/Medicare Shared Savings Program (MSSP) initiative

Houston Methodist’s MSSP program, Houston Methodist Coordinated Care (HMCC), can track and report Medicare patients’ healthcare visits and medical details. The program’s execution layers several technologies on a foundation of the organization’s integrated EHR platform and a separate population management tool.

The project was centered around six core elements: 1) becoming the first ACO (accountable care organization) in Texas to acquire real-time admission, discharge and transfer (ADT) notification capability that links all health providers; 2) chronic heart failure home monitoring; 3) real-time notification when HMCC patients came into the ED; 4) risk assessments for emergency room visits, hospital readmissions and the need for complex care; 5) same-day appointment facilitation; and 6) care team alerts. In all, there were 17,000 Houston Methodist patients in HMCC in 2017, year-to-date, with 105 participating physicians, and total healthcare cost savings year-to-date exceed $1.3 million, according to officials.

Indiana University Health (Indianapolis, Ind.)

FHIR HIEdrant: making big data actionable at the point of care

One of the difficult challenges for many HIEs (health information exchanges) is the time and effort it takes to reach out to a second system to search for needed data at the point of care. The goal at IU Health was therefore to develop an application within the clinical workflow that, at the click of a single button, brings data relating to the patient’s chief complaint back from the HIE into that workflow.

The first phase of this project built the framework and mechanisms to make this possible and applied them to a single context: an emergency department patient with chest pain. Leaders at IU Health are using the Fast Healthcare Interoperability Resources (FHIR) standard to communicate from the IU Health Cerner EHR out to the HIE and retrieve five specific data elements that are germane to caring for a chest pain patient in the emergency department and understanding their risk. Within the workflow, the clinician is presented with the most recent ECG, cardiology note, discharge summary, and catheterization report, among other elements. According to IU Health officials, this is the first FHIR-based application that directly accesses an HIE and delivers context-specific data about a patient directly to the clinical workflow.
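Retrieving “the most recent document of a given type” is a standard FHIR search pattern. The sketch below shows what such a query might look like; the base URL is invented (IU Health’s actual HIE endpoint and integration are not public), and the LOINC document-type codes are shown for illustration:

```python
from urllib.parse import urlencode

# Hypothetical FHIR base URL; the real HIE endpoint is not public.
FHIR_BASE = "https://hie.example.org/fhir"

def latest_document_query(patient_id, loinc_code):
    """Build a FHIR R4 DocumentReference search returning only the most
    recent document of a given type (by LOINC code) for one patient.
    _sort=-date orders newest first; _count=1 keeps a single result."""
    params = urlencode({
        "patient": patient_id,
        "type": f"http://loinc.org|{loinc_code}",
        "_sort": "-date",
        "_count": "1",
    })
    return f"{FHIR_BASE}/DocumentReference?{params}"

# Two of the chest-pain elements might map to queries like these
# (standard LOINC document-type codes, used illustratively):
queries = {
    "discharge summary": latest_document_query("12345", "18842-5"),
    "cardiology note":   latest_document_query("12345", "34752-6"),
}
```

A real client would issue these requests with an authorized HTTP call and parse the returned Bundle; the point here is that one parameterized search per data element is enough to drive the single-button workflow described above.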

Johns Hopkins Health System (Baltimore, Md.)

inHealth precision medicine initiative

The precision medicine initiative at Johns Hopkins Medicine and Johns Hopkins University, inHealth, seeks to improve individual and population health outcomes through scientific advances at the intersection of biomedical research and data science. Through a collaboration between the Johns Hopkins Applied Physics Laboratory (APL) and Johns Hopkins Medicine (JHM), inHealth is building a big-data precision medicine platform with the goal of accelerating the translation of insight into care delivery.

The first result of this broad, multidisciplinary effort was the creation of two Precision Medicine Centers of Excellence (PMCoE) focused on multiple sclerosis and prostate active surveillance. The organization’s Technology Innovation Center has developed applications to capture new data and insights from clinical practice and feed them back into discovery. Physicians have begun using the discovery platform to facilitate conversations with their patients about treatment options and risks. The experiences of these centers will guide the next wave of PMCoEs, expanding the utility of the platform.

Lakeland Health (St. Joseph, Mich.)

Something wicked this way comes

Leaders at Lakeland Health set three core cybersecurity goals: (a) put risk management and cybersecurity near the top of the health system’s leadership agenda; (b) use innovative strategies and tools to execute the cybersecurity program; and (c) shift the focus from fear to clinical integrity. The cybersecurity program covered the hospitals, clinics, home care, hospice, and all the different legal entities that make up the health system. To ensure strategic direction and alignment, a steering committee was established that met every two weeks.

The cybersecurity program’s execution focused on three work streams: process, technology, and team members. In the process work stream, execution covered implementation and auditing of policies and procedures, risk assessment and HIPAA (Health Insurance Portability and Accountability Act) compliance, and a monthly information security executive dashboard reviewed by the steering committee. Despite a continually evolving threat landscape, the program delivered strong results in several areas: more than 100 business associate agreements (BAAs) were signed; annual HIPAA risk assessment and remediation plans were put in place; the internal phishing campaign eventually lowered the click rate to 10 percent; suspicious emails forwarded to the security team increased five-fold; and more than 1,000 laptops were encrypted.

Lexington Clinic (Lexington, Ky.)

Development of a direct-to-employer network

Costs for certain services often vary dramatically between providers. By selectively designing benefits to increase cost-sharing at providers who deliver more expensive care, enrollees are incentivized to see the more efficient providers who deliver care at lower cost, reducing average overall expenditure; savings can then be passed on to the employer. In this project, implementation was examined with an organization using a self-funded insurance model. Steering beneficiaries toward a tighter network of providers resulted in significant overall reductions in expenditure while improving the health of the overall employee population. Rather than limiting its employee health plan to a lower percentage of area providers, as most similar plan designs do, the employer entered into a direct-to-employer program with a local, multispecialty physician group: Lexington Clinic.

A key component of a direct-to-employer plan is population health. Lexington Clinic used analytics software to deliver value to the employer by implementing high-cost/high-utilization analysis, undetected chronic disease engagement, and ancillary modality management. Lexington Clinic also determined that specific interventions could be made at critical junctures in the care continuum of the employee population, designed to prevent health issues before they arise, reducing future expenditures and avoiding worsened health outcomes. Via the Lexington Clinic premier network, the employer demonstrated a clear reduction in aggregate expenditure of more than 4 percent from 2015 to 2016.

Lutheran Medical Center (Wheat Ridge, Colo.)

An app for staff engagement

At Lutheran Medical Center, it became a priority to redesign the way staff were engaged. In 2016, the organization began using an anonymous crowdsourcing platform with the goal of creating “recipes for success” that would help leaders ask the right questions through an anonymous tool and enhance engagement. The tool has given staff a venue to engage in problem solving and design ideas on their own terms, anonymously, while everyone can follow the conversation in real time. The application was first used in the pharmacy department as a means of understanding low engagement scores, and it allowed all staff to be involved without taking them away from their daily duties.

Only two years ago, Lutheran Medical Center was in the 11th percentile for staff satisfaction; it now ranks in the 43rd percentile compared to the national average. The ability to get staff buy-in before a change happens has been critical to improving staff satisfaction. Before, only those invited to certain meetings had the opportunity to voice their opinions; now everyone can be reached with a single email or use of the app. The tool has also been used to solve several clinical problems, such as how to design a “cord-free” patient room and how to transport oxygen tanks around the hospital.

Mercy (St. Louis, Mo.)

Using NLP for heart failure EHR documentation

The goal of this project was to use natural language processing (NLP) to extract key cardiology measures from physician and other clinical notes and incorporate the results into a dataset with discrete data fields. This dataset would then be used to obtain actionable information and contribute to the evaluation of outcomes of medical devices in heart failure patients.

Three key measures that are commonly stored in clinical notes but not available in discrete fields are the ejection fraction measurement; patient symptoms, including dyspnea, fatigue, dizziness, palpitations, edema and pulmonary congestion; and the New York Heart Association (NYHA) heart failure classification. Some 35.5 million clinical notes from Mercy patients’ inpatient and outpatient encounters were extracted, processed, and loaded onto an NLP server. A team of Mercy data scientists developed NLP queries to search for relevant linguistic patterns and then evaluated them for both precision and recall. The use of NLP in this project facilitated the extraction of vital patient information that is not available in any discrete field in the EHR. Without the ability to track changes in these three essential measures, it becomes much harder to identify the point of disease progression, which is a crucial factor in evaluating current treatments and could inform future interventions, according to Mercy officials.
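The flavor of this kind of extraction can be shown with a simple rule-based sketch. The regexes below are assumptions for demonstration only; Mercy’s production NLP queries are far more sophisticated and were formally evaluated for precision and recall:

```python
import re

# Illustrative rule-based patterns, not Mercy's actual NLP queries.
EF_PATTERN = re.compile(
    r"(?:ejection fraction|EF)\s*(?:of|is|:)?\s*(\d{1,2})\s*%", re.IGNORECASE)
NYHA_PATTERN = re.compile(
    r"NYHA\s*(?:class)?\s*(I{1,3}V?|IV)", re.IGNORECASE)
SYMPTOMS = ["dyspnea", "fatigue", "dizziness", "palpitations", "edema"]

def extract_hf_measures(note):
    """Pull ejection fraction, NYHA class and symptom mentions out of a
    free-text note and return them as discrete fields."""
    ef = EF_PATTERN.search(note)
    nyha = NYHA_PATTERN.search(note)
    return {
        "ejection_fraction": int(ef.group(1)) if ef else None,
        "nyha_class": nyha.group(1).upper() if nyha else None,
        "symptoms": [s for s in SYMPTOMS if s in note.lower()],
    }

note = ("Patient reports worsening dyspnea and ankle edema. "
        "Echo shows ejection fraction of 35%. NYHA class III.")
result = extract_hf_measures(note)
# result: {"ejection_fraction": 35, "nyha_class": "III",
#          "symptoms": ["dyspnea", "edema"]}
```

Against 35.5 million notes, patterns like these would be tuned iteratively, with each revision re-scored for precision (how many matches are correct) and recall (how many true mentions are found), as the article describes.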

Mosaic Life Care (St. Joseph, Mo.)

Revenue management analytics dashboard

Mosaic Life Care provides healthcare and life care services in and around St. Joseph, Missouri, and the Kansas City Northland area. The organization’s finance and revenue cycle teams faced challenges with data silos that required caregivers to manually obtain information from disparate systems and collate it by hand, leaving the process subject to human error, inconsistent processes, and concerns about data accuracy.

With the goal of developing a flexible “source of truth” dashboard, the enterprise data warehouse (EDW) team developed an integrated revenue management analytics solution with a front-end dashboard, leveraging the core EDW solution and architecture platform to extract data from best-of-breed systems. Through the new dashboard, financial analysts and management teams can perform analysis and predict future trends. As a result, the dashboard enables real-time, data-driven business decisions that draw on multiple disparate systems within a single unified platform.

NYU Langone Health (New York City, N.Y.)

Value-based medicine to improve clinical care

The goal of the project was to leverage health IT tools and related workflows to improve the value of inpatient care. Finance collaborated with the project’s physician champions to identify variations in care both internally and compared to benchmarked external institutions. The project’s physician champions collaborated with IT physician informaticists and IT project teams to design interventions to both reduce cost and improve clinical care.

The suite of interventions included: electronic clinical pathways; blood protocols; intravenous (IV) to oral (PO) medication changes; and lab ordering enhancements. Electronic pathways were created for heart failure, colon surgery, and pneumonia, and blood ordering clinical decision support and analytics were built. These projects realized significant two-year savings, including: electronic clinical pathways: $12.9 million; lab modifications: $3 million; blood utilization: $2.9 million; and IV to PO: $2.2 million.

Penn Medicine (Philadelphia, Pa.)

Standard clinical iPhone effectively enhances patient care

In January 2016, Penn Medicine met with Apple engineers to develop an economical and efficient full-configuration Standard Clinical iPhone (SCiP) that works with Penn’s mobile device management tool while leveraging Apple’s Device Enrollment Program (DEP) and Volume Purchasing Program (VPP). This approach saved the organization 975 man-hours in its initial deployment through DEP streamlined setup (15 minutes versus one hour per iPhone). Pushing additional apps to devices without needing an Apple ID and password for download, or any manual touch, also made the implementation efficient. The project put a vital tool into caregivers’ hands, officials say.
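The 975-hour figure is consistent with the per-device times quoted. A quick back-of-the-envelope check (the implied fleet size is our inference, not a number Penn reported):

```python
# Per-device setup time with and without DEP streamlined enrollment,
# per the figures in the article.
manual_hours = 1.0            # one hour per iPhone, manual setup
dep_hours = 15 / 60           # fifteen minutes per iPhone with DEP
saved_per_device = manual_hours - dep_hours   # 0.75 h saved per device

# The article reports 975 man-hours saved; the implied number of
# devices in the initial deployment (not stated in the article):
devices = 975 / saved_per_device
print(devices)   # 1300.0
```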

Using secure texting alone, clinicians were able to coordinate patient flow across care settings with multiple providers and mend gaps in communication. In one example, the cardiac surgery team was on a thread with a patient’s current caregivers: concerned about swelling around the surgical site, the nurse took a picture of the site and sent it securely with a description. The surgical team provided immediate feedback and resolved the issue remotely.

Innovator Awards Program 2018: Semifinalists – Healthcare Informatics

Editor’s Note: We at Healthcare Informatics were once again ecstatic with the exceptional quality of the submissions we received from innovating patient care organizations across the U.S. In addition to the four winning teams this year (whose stories will be posted throughout this week), our editorial team also selected several runners-up. Below, please find descriptions of the initiatives of the 14 teams whom we have awarded semifinalist status in this year’s program.

Centre for Addiction and Mental Health (Ontario, Canada)

Improving patient care through achievement of HIMSS EMRAM Stage 7

In May 2014, CAMH implemented a clinical information system using a big-bang approach with an integrated team of clinicians, information technology, and other staff. But after implementation, CAMH noted a lack of clinical practice standardization. A new initiative emerged that included work to refine the inputting of clinical documentation to the EHR, the development of electronic whiteboards to display and manage assessment and risk factors, leveraging data to inform improvement initiatives, and many other requirements as defined by the HIMSS EMRAM (EMR Adoption Model) Stage 7 criteria.

CAMH is the first academic teaching hospital to achieve HIMSS Stage 7 in Canada; this achievement is a milestone in both the Canadian and international health landscape. Now, more than 99 percent of CAMH clinically-relevant documentation is completed directly within the EHR and CPOE (computerized provider order entry) rates have been over 90 percent since December 2016. What’s more, the creation of a suicide risk dashboard has led to 90 percent of patients having a suicide risk assessment completed within 24 hours of admission.

Cleveland Clinic (Cleveland, Ohio)

An enterprise imaging service

The goal of the enterprise imaging service is to provide a comprehensive longitudinal medical record through incorporation of all medical images into a single archive. Through a universal viewer, the archive is integrated with the EHR and provides a foundation for image distribution to all caregivers throughout the enterprise. The archive also serves as a foundation for image sharing. Implementation required a comprehensive assessment of all image generating equipment throughout all hospitals and outpatient centers.

The Clinic’s officials say that the establishment of an enterprise imaging program has led to the consolidation of imaging archives throughout the health system. Images which were not previously easily accessible are now readily viewable through the EHR (electronic health record) with access points both within the firewall and from home. To date, 11 different service lines and more than 440 pieces of image generating equipment outside of radiology have been integrated.

Compass Medical (Massachusetts)

Annual wellness, chronic care management and quality outcomes

By leveraging new information technology, Compass Medical has been able to follow proven population health management and care management principles, allowing patient care leaders to identify and target specific population groups, stratify and prioritize care gaps and engage and individualize care plan activities. In 2016, for example, Compass Medical was able to identify and target more than 14,000 Medicare patients that were struggling to manage their chronic health conditions and needed a more personalized and comprehensive care plan. One year later, Compass Medical developed and launched a new Chronic Care Management Program to help engage with and closely manage Medicare patients that suffer from two or more chronic health conditions. With the help of its EHR and big data platform, Compass Medical positioned itself to automate many of the workflows for care management nurses.

The Annual Wellness Visit (AWV) is another example of a preventive care service that has been positively affected by leveraging IT. In 2017, national trends suggested utilization of AWV were still hovering around the low 20-percent range with the highest performing state reaching 35 percent. Utilizing EHR-based patient engagement campaigns for increasing focused outreach, incorporating a team based care model with scribes, and creating standard work processes for reducing provider burden have helped Compass Medical reach 57 percent AWV utilization for its Medicare eligible population by the end of year 2017.

Duke University School of Medicine (Durham, N.C.)

A NICU discrete event simulation model

Duke’s neonatal clinicians care for more than 800 babies each year in the Duke Neonatal Intensive Care Unit (NICU). Although the majority do well, about 40 babies do not survive. How could they improve outcomes and save lives? Duke’s neonatal research team partnered with analytics company SAS to create an analytics-based model of Duke Children’s Hospital’s Level IV neonatal intensive care unit. The result was the creation of a discrete event simulation model that closely resembled the clinical outcomes of Duke’s training unit, which was validated using data held back from the original model, which also closely tracked actual unit outcomes.

The model uses a vast resource of clinical data to simulate the experience of patients, their conditions and staff responses in a computerized environment. It creates virtual babies experiencing care within a simulated NICU environment, including virtual beds staffed by virtual nurses. The research team attests that they cannot find any evidence of discrete event simulation modeling being used in a NICU setting, making this a first in neonatal care.

Houston Methodist (Houston, Texas)

A coordinated care/Medicare Shared Savings Program (MSSP) initiative

Houston Methodist’s MSSP program, Houston Methodist Coordinated Care (HMCC), can track and report Medicare patients’ healthcare visits and medical details. The successful execution of the program is a layering of technologies with the foundation being the organization’s integrated EHR platform and a separate population management tool.

The project was centered around six core elements: 1) becoming the first ACO (accountable care organization) in Texas to acquire real-time admission, discharge and transfer (ADT) notification capability that links all health providers; 2) chronic heart failure home monitoring; 3) real-time notification when HMCC patients came into the ED; 4) risk assessments for emergency room visits, hospital readmissions and the need for complex care; 5) same-day appointment facilitation; and 6) care team alerts. In sum, there were 17,000 Houston Methodist patients in HMCC in 2017, year-to-date, with 105 participating physicians. Total healthcare cost savings year-to-date are more than $1.3 million, according to officials.

Indiana University Health (Indianapolis, Ind.)

FHIR HIEdrant: making big data actionable at the point of care

One of the difficult challenges for many HIEs (health information exchanges) is the time and effort that it takes to reach out to a second system to search for needed data at the point of care. As such, the goal at IU Health was to develop an application within the clinical workflow that will, at the click of a single button, bring back data to that workflow relating to the patient’s chief complaint from the HIE.

The first phase of this project was building the framework and the mechanisms to make this a possibility and apply it to a single context: an emergency department patient with chest pain. Leaders at IU Health are utilizing the Fast Healthcare Interoperability Resources (FHIR) standard to communicate out from the IU Health Cerner EHR to the HIE to retrieve five specific data elements that are germane to caring for a chest pain patient in the emergency department and understanding their risk. Within the workflow, the clinician is being presented the most recent: ECG, cardiology note, discharge summary, catheterization report, and more. According to IU Health officials, this is the first FHIR-based application that directly accesses an HIE and delivers context-specific data about a patient directly to the clinical workflow.

Johns Hopkins Health System (Baltimore, Md.)

inHealth precision medicine initiative

The precision medicine initiative at Johns Hopkins Medicine and University–inHealth–seeks to improve individual and population health outcomes through scientific advances at the intersection of biomedical research and data science. Through a collaboration of The Johns Hopkins Applied Physics Laboratory (APL), and Johns Hopkins Medicine (JHM), inHealth is building a big-data precision medicine platform with the goal of accelerating the translation of insight into care delivery.

The first result of this broad, multidisciplinary effort was the successful creation of two Precision Medicine Centers of Excellence (PMCoE) focused on multiple sclerosis and prostate active surveillance. The organization’s Technology Innovation Center has developed applications to garner new data and learnings from clinical practice and feedback into discovery. Physicians have begun using the discovery platform to facilitate conversations with their patients about their treatment options and risks. The experiences of these centers will lead the next wave of PMCoEs, expanding the utility of the platform.

Lakeland Health (St. Joseph, Mich.)

Something wicked this way comes

Leaders at Lakeland Health set three core cybersecurity goals: (a) put risk management and cybersecurity near the top of health system leadership agenda; (b) use innovative strategies and tools to execute the cybersecurity program; and (c) shift focus from fear to clinical integrity. The cybersecurity program covered the hospitals, clinics, home care, hospice and all the different legal entities which comprised the health system. In order to ensure strategic direction and alignment, a steering committee was set up which met every two weeks.

The cybersecurity program execution was focused on three work-streams—process, technology and team members. In the process work stream, execution covered implementation and audit of policies and procedures, risk assessment and HIPAA (Health Insurance Portability and Accountability Act) compliance, and a monthly information security executive dashboard which was reviewed by the steering committee. Despite this continuing threat, the cybersecurity program delivered strong results in different areas, including: more than 100 business associate agreements (BAA) were signed; annual HIPAA risk assessment and remediation plans were put in place; the initial internal phishing campaign eventually lowered the click rate to 10 percent; there was a five-fold increase in the suspicious emails forwarded to the security team; and more than 1,000 laptops were encrypted.

Lexington Clinic (Lexington, Ky.)

Development of a direct-to-employer network

Costs of certain services often vary dramatically between providers, so by selectively designing benefits to increase cost-sharing at providers who provide more expensive care, enrollees are incentivized to see the more efficient providers who provide care at a lower cost, reducing average overall expenditure. Savings can then be passed on to the employer. In this project, implementation was examined with an organization with a self-funded insurance model. Steering beneficiaries toward a tighter network of providers resulted in significant overall reductions in expenditure while improving the health of the overall employee population. Rather than limiting their employee health plan to a lower percentage of area providers like most similar plan designs, the employer entered into a direct-to-employer program with a local, multispecialty physician group: Lexington Clinic.

A key component of a direct-to-employer plan is population health. Lexington Clinic was able to utilize analytics software to deliver value to the employer by implementing high cost/high utilization analysis, undetected chronic disease engagement, and ancillary modality management. Lexington Clinic also determined that there were specific interventions that could be made at critical junctures in the care continuum of the employee population. These interventions would be designed to prevent health issues before they arise, reducing future expenditures and worsened health outcomes. Via the Lexington Clinic premier network, the employer demonstrated a clear reduction in aggregate expenditure from the 2015 to 2016 time period of more than 4 percent.

Lutheran Medical Center (Wheat Ridge, Colo.)

An app for staff engagement

At Lutheran Medical Center, it became a priority to redesign the way in which the staff was engaged. The organization started to use an anonymous crowdsourcing platform in 2016 with the goal to create recipes for success that would help leaders in the organization ask the right questions through an anonymous tool to enhance engagement. Using the tool has established a venue for staff to engage in problem solving and design ideas on their own terms in an anonymous way where all can follow along in the conversation in real time. The application/website started its use in the pharmacy department as a means to understand low engagement scores. This tool allowed for all staff to be involved while not taking them away from their daily duties.

Lutheran Medical Center was in the 11th percentile when it came to staff satisfaction only two years ago, but now ranks in the 43rd percentile compared to the national average. Having the ability to get staff buy-in before a change happens has been critical in impacting staff satisfaction. Before, only those invited to certain meetings had the opportunity to voice their opinions; now everyone can be reached with a single email or app use. It has been used for solving several clinical problems as well, such as how to design a “cord-free” patient room, and how to transport oxygen tanks around the hospital.

Mercy (St. Louis, Mo.)

Using NLP for heart failure EHR documentation

The goal of this project was to use NLP to extract key cardiology measures from physician and other clinical notes and incorporate the results into a dataset with discrete data fields. This dataset would then be used to obtain actionable information and contribute to the evaluation of outcomes of medical devices in heart failure patients.

Three key measures that are commonly recorded in clinical notes but not available in discrete fields are the ejection fraction measurement; patient symptoms, including dyspnea, fatigue, dizziness, palpitations, edema and pulmonary congestion; and the New York Heart Association (NYHA) heart failure classification. Some 35.5 million clinical notes from Mercy patients’ inpatient and outpatient encounters were extracted, processed and loaded onto an NLP server. A team of Mercy data scientists developed NLP queries to search for relevant linguistic patterns and evaluated them for both precision and recall. The use of NLP in this project enabled the extraction of vital patient information that is not available in any discrete field in the EHR. Without the ability to track changes in these three essential measures, it becomes much harder to identify the point of disease progression, which is crucial for evaluating current treatments and could inform future interventions, according to Mercy officials.
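As a minimal illustration of this kind of pattern-based extraction, the snippet below pulls an ejection fraction and NYHA class out of a free-text note with regular expressions. The patterns are assumptions for illustration only, not Mercy's actual NLP queries, which would also need negation handling and the precision/recall evaluation described above.

```python
import re

# Illustrative patterns (assumptions) for two of the three measures named above.
EF_PATTERN = re.compile(r"(?:ejection fraction|EF)\s*(?:of|is|:)?\s*(\d{1,2})\s*%",
                        re.IGNORECASE)
NYHA_PATTERN = re.compile(r"NYHA\s*(?:class)?\s*(I{1,3}V?|IV)", re.IGNORECASE)

def extract_measures(note):
    """Return ejection fraction (int %) and NYHA class found in a note, if any."""
    ef = EF_PATTERN.search(note)
    nyha = NYHA_PATTERN.search(note)
    return {
        "ejection_fraction": int(ef.group(1)) if ef else None,
        "nyha_class": nyha.group(1).upper() if nyha else None,
    }

note = "Patient with dyspnea on exertion. Echo shows ejection fraction of 35%. NYHA class III."
print(extract_measures(note))  # {'ejection_fraction': 35, 'nyha_class': 'III'}
```

In a production pipeline, the matched values would be written into discrete dataset fields so that changes over time can be tracked across encounters.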

Mosaic Life Care (St. Joseph, Mo.)

Revenue management analytics dashboard

Mosaic Life Care provides healthcare and life care services in and around St. Joseph, Missouri, and the Kansas City Northland area. The organization’s finance and revenue cycle teams faced challenges with data silos that required caregivers to manually obtain information from disparate systems and collate it, subjecting the process to human error, inconsistent workflows and concerns about data accuracy.

With the goal of developing a flexible “source of truth” dashboard, the enterprise data warehouse (EDW) team developed an integrated revenue management analytics solution with a front-end dashboard, leveraging the core EDW platform and architecture to extract data from the organization’s best-of-breed systems. Through the new dashboard, financial analysts and management teams can perform analysis and predict future trends, enabling real-time, data-driven business decisions across multiple disparate systems within a single unified platform.

NYU Langone Health (New York City, N.Y.)

Value-based medicine to improve clinical care

The goal of the project was to leverage health IT tools and related workflows to improve the value of inpatient care. Finance collaborated with the project’s physician champions to identify variations in care both internally and compared to benchmarked external institutions. The project’s physician champions collaborated with IT physician informaticists and IT project teams to design interventions to both reduce cost and improve clinical care.

The suite of interventions included electronic clinical pathways, blood protocols, intravenous (IV) to oral (PO) medication changes, and lab ordering enhancements. Electronic pathways were created for heart failure, colon surgery and pneumonia, and blood ordering clinical decision support and analytics were built. These projects realized significant two-year savings: electronic clinical pathways, $12.9 million; lab modifications, $3 million; blood utilization, $2.9 million; and IV to PO, $2.2 million.
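The per-initiative figures above can be totaled to see the combined impact (the sum is our arithmetic, not a figure stated in the article):

```python
# Two-year savings reported for each intervention, in millions of dollars.
savings = {
    "electronic clinical pathways": 12.9,
    "lab modifications": 3.0,
    "blood utilization": 2.9,
    "IV to PO": 2.2,
}
total = sum(savings.values())
print(f"${total:.1f}M combined two-year savings")  # $21.0M combined two-year savings
```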

Penn Medicine (Philadelphia, Pa.)

Standard clinical iPhone effectively enhances patient care

In January 2016, Penn Medicine met with Apple engineers to develop an economical, efficient, fully configured Standard Clinical iPhone (SCiP) that would work with Penn’s mobile device management tool while leveraging Apple’s Device Enrollment Program (DEP) and Volume Purchasing Program (VPP). This approach saved the organization 975 man-hours in its initial deployment through DEP’s streamlined setup (15 minutes versus one hour for each iPhone). Pushing apps to devices without needing an Apple ID and password, or any manual touch, also made the implementation efficient. The project placed a vital tool into caregivers’ hands, officials say.
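The reported savings imply a deployment size, which is easy to back out from the article's numbers (the device count is our inference, not a Penn Medicine figure):

```python
# Back-of-the-envelope check of the reported deployment savings.
# Minutes per phone and total hours saved come from the text above.
manual_minutes_per_phone = 60   # one hour of hands-on setup per iPhone
dep_minutes_per_phone = 15      # streamlined DEP setup
saved_hours_reported = 975

saved_minutes_per_phone = manual_minutes_per_phone - dep_minutes_per_phone  # 45 min
implied_devices = saved_hours_reported * 60 / saved_minutes_per_phone
print(implied_devices)  # 1300.0 -> roughly 1,300 iPhones in the initial rollout
```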

Just by using secure texting, clinicians were able to coordinate patient flow across care settings with multiple providers and mend gaps in communication. In one example, the cardiac surgery team was on a thread with a patient’s current caregivers. Concerned about swelling around the surgical site, the nurse took a picture of it, added a description and sent it securely. The surgical team was able to provide immediate feedback and resolve the issue remotely.

A cybersecurity expert explains how to fight Russian election meddling – Vox

There is little room for doubt that Russia interfered in the 2016 election. On Friday, the Justice Department handed down indictments against 13 Russian individuals and three Russian companies for meddling in United States political and election processes, the latest item in a litany of evidence that Russia, well, did it.

Even scarier, there is every indication that Russia is likely to try to interfere in the American political process again — and many of the technologies, trends, and processes it exploited in the past are largely unchanged. (Catch that New York Times story on the Twitter bot factories?)

“I’ll tell you right up front, it is going to happen again,” Greg Touhill, a retired Air Force general officer and one of the nation’s premier cybersecurity experts, told me. Touhill is currently president of Cyxtera Federal Group, a secure infrastructure company. Before that, he served in a wide range of government roles, including as the first United States chief information security officer in 2016.

I spoke with Touhill about what the United States can do to stop Russia from interfering in US politics and elections in 2018 and beyond. While the federal government certainly has a major role to play — in deterring future interference, in supporting state and local election officials, and in boosting national security efforts — Touhill noted that the technology companies Russians use as a conduit in their disinformation campaign have a responsibility as well.

So do everyday Americans, in using good judgment when they’re reading news sources: “If it sounds phony, it probably is,” he said.

This interview has been edited and condensed for clarity.

Emily Stewart

We keep getting more details about Russian meddling in the 2016 election, including Friday’s indictments, and we’re also seeing warnings that Russians are likely to try something again in 2018. What can and should the federal government and other entities be doing so that we don’t see this happen again?

Greg Touhill

I’ll tell you right up front, it is going to happen again. It’s happened before, and frankly, it’s happened throughout all of time. A different way to phrase it is how do we prepare ourselves to deal with this when it happens again? And how do we mitigate it and the like?

Information operations, influence operations, or whatever you want to call it — and different nations call it different things — people have recognized, as Francis Bacon used to say, knowledge is power. They’re constantly trying to seek the ability to influence and get knowledge and get an information advantage. From my perch, I think that we want to deter further action, we want to mitigate it when it does happen, and we want to take action that’s effective and proportionate when we do detect that somebody is breaking international norms.

Emily Stewart

How do you balance deterrence of future action against retaliation or punishment of past action? How would you approach it?

Greg Touhill

If you take a look at all the different instruments of power that are available to the United States, we have the military option, which as a retired officer I think should be the last resort, but certainly it should be on the table for consideration, particularly when it comes to deterrence. We also have the political, the economic, and the diplomatic means as well.

First things first is you have to — when you see somebody who is breaking norms and is engaged in things that we don’t believe as an international community are the right things to do — you need to confront that, and you need to present the evidence that says, “Hey, here is where you are breaking the norms.”

The United States government has been working on a forward-leaning leadership approach to cyber norms. That should be a priority in the international community, and the United States should take a continuous leadership role in making sure that we have a clear understanding and articulation of acceptable behavior in the cyber domain. Affirmation of the cyber norms that have already been proposed needs to be a priority for our diplomatic efforts.

Secondly, when we see folks that are deviating from those norms, there needs to be some accountability, and that’s where we have the ability under our current legal framework to issue economic sanctions, diplomatic sanctions, and, in [Friday’s] case, legal indictments, where we are trying to hold individuals and states accountable for violating law and, as I mentioned, norms of acceptable behavior.

Emily Stewart

What agencies or entities within the government need to take the lead here?

Greg Touhill

Frankly, this is a whole of government issue. And as you take a look at all those instruments of national power, it’s distributed across departments and agencies. That’s a reason why in 1947 we established the National Security Council to help coordinate a lot of the activities dealing with national security.

I would submit that our national security and our national prosperity are intrinsically linked to cybersecurity and the integrity of information technology and the information contained within it. Try to name a business or a societal institution that doesn’t rely on IT right now; it’s very difficult. As we take a look at the roles across the federal government (the Department of State, the Department of Treasury, the Department of Homeland Security, the Department of Defense, the Department of Commerce, the Department of Justice), virtually every single major department and agency has a stake in those elements of national power that we could use and leverage to deal with issues of deterrence and proper response to cyberattacks.

The National Security Council, working under the National Command Authority, that’s where I’m looking for leadership to coordinate all instruments of national power.

Emily Stewart

What about the president? On Friday, the indictments come down, and he says, “No collusion!”

Greg Touhill

I don’t necessarily see the discussion of collusion being the same as to acknowledge that we have an issue with Russian-based actors engaged in influence operations against the United States. I took the collusion issue as a separate domestic issue as opposed to the actual influence operations.

I believe that the evidence we’ve seen thus far points toward Russian-based actors engaged in targeted influence operations directed against the people of the United States with what appears to be an ultimate goal to undermine democratic institutions in the United States.

Emily Stewart

Well, but Trump doesn’t seem hyper concerned about Russia; he seems to be downplaying it.

Greg Touhill

I don’t know President Trump, nor do I know his leadership style, so I really can’t comment on that.

It’s very possible, and I wouldn’t rule it out, that he has directed the National Security Council to provide him different options, and as you take a look at activities at [a] nation-state level, many of those deliberations are going to be held in very classified settings. At this point, I really can’t comment because I don’t know what he’s directing in the background, nor would I expect to if it were President Obama or President Bush or President Clinton or any of his predecessors. This is really an important topic, and I’m confident that the National Security Council is in fact looking at all the different options that would be on the table and advising the president as such.

Emily Stewart

Beyond the government and the president, what do companies like Facebook and Twitter, which seem to be a major part of what happened in 2016, need to be doing?

Greg Touhill

If you look at it through the lens of cybersecurity, I think there are three major lenses: people, process, and technology. You’re taking a look at all sorts of different media platforms, including Twitter, Facebook, and the like, which as social media are powerful platforms. You want to make sure you get it right.

You want to make sure that your people are properly trained to maintain the integrity of the product and information that you’re putting out. You want to make sure that you have the proper processes in place to properly vet input so that you, in fact, are not putting out, for lack of a better term, “fake news.” It’s almost like yelling, “Fire!” in a movie theater: You want to make sure that you are, in fact, accurate and that your product is trusted. You want to put in the right technologies to make sure that you have positive control over the information that you’re sharing.

There are plenty of tools that are currently developed and being fielded right now that can help on the technology standpoint, and certainly training and processes are part of good order and discipline in any business these days. From a technology standpoint, you should not let anybody have access to your information or equipment or systems and the like.

Having positive control over the platforms themselves is critically important. Technologies such as identity-centric software-defined perimeters, which validate authorities and identities prior to connecting (authorization first, connection second), are key. More and more companies that want positive control over their technology, to protect the information inside it, are switching to things like software-defined perimeters, regardless of what industry they’re in: finance, social media, and so on.

I am heartened, though, by the rhetoric of some of the companies, where they’re coming out and saying, “Hey, we’re putting things in so people, if they see something, they can say something, question whether or not this is fake news.” That’s a step in the right direction, but I want to see more.

Emily Stewart

I’m interested in this question of whether social media companies need to know their customers. Banks are subject to know-your-customer and anti-money laundering laws; can’t technology companies be too? At the same time, with those sorts of regulations, you tend to hear protests on the First Amendment front — namely, shouldn’t people be able to say whatever they want, presumably, on Twitter, even if it is a bot?

Greg Touhill

That gets back to yelling, “Fire!” in a movie theater. There was a great debate about 100 years ago as to First Amendment rights: Do you have the right to yell, “Fire!” in a movie theater if public safety is at risk? If we take a look at the different companies that are out there, do they in fact have a code of ethics to make sure the information presented is in fact proper?

Google, what’s their theme? Do no harm, right? If Google is serving up information that may in fact be harmful, is that contrary to their own ethics? It’s a heavy issue, and I’m not necessarily a philosopher, but Professor Touhill would tell you that you’ve got a great capability, and technology doesn’t always solve every problem. Leadership is needed at all levels, including in the technology arena, to try to combat this problem.

And as I also tell my mother, you need to not draw conclusions from a single news source; you need to go survey the whole landscape. I believe that freedom of the press here in the United States is one of our greatest strengths, and I expect the press to do their bit too, to make sure that when they’re seeing fake news they’re pulling it out so that we can, in fact, all work together as a team, as a people, to make sure that the general population gets the right news, the truth. That’s what we’re all looking for. It’s more than just technology.

Emily Stewart

Along those lines, beyond the government, tech companies, the press, what about me, sitting at home on my computer? Is there some role citizens need to play in this in being smarter in the way that they consume news and information?

Greg Touhill

There are some very straightforward things that every citizen can and should be doing.

One is don’t believe everything you see online. Do your homework, check multiple sources, stay away from suspicious websites, and go to news sources that are trusted and maintain the same level of integrity you would hope to promote yourself. You want to get your news from folks who will double-check and triple-check their sources, who are unimpeachable, who recognize their responsibility. And if it’s coming from a news source that you don’t know, then it’s not necessarily a trusted source. That’s the first thing.

Second thing, follow the advice I gave my mother — get your news from multiple sources. There’s more than one network on TV, and there’s more than one newspaper online. The great news organizations have at their core the same story, but they give you different analyses, different perspectives. If you want to be better educated into the news, you’re better served by understanding those different perspectives. Make sure that you’re doing your homework and not necessarily going to just one news source.

Third, if it sounds phony, it probably is. Dig deeper when you see things that seem outrageous. You may find that something particularly outrageous, if it’s not coming from a trusted news source, probably is made up.

Emily Stewart

In wrapping up, going forward, just looking at the next six months, if you could pick out three things that the federal government could do to safeguard election integrity, what do you think they should do?

Greg Touhill

Number one, work with state governments — state, local, county, tribal, territorial governments — because all elections are managed locally. The federal government does not go out and do voter registration; the federal government does not do the collection of votes, and the federal government does not do the tabulation of votes. That’s all done locally and up to the state level.

It’s really important for the federal government to work with the states and the counties to make sure they are hardened. I mentioned those three processes: voter registration, the actual casting of the ballot, and the tabulation of votes. All three are critical.

That’s all done at the state level, [but] the federal government can assist the states on that. They can assist with best practices, and having been director of the NCCIC (the National Cybersecurity and Communications Integration Center, which houses the US-CERT and the ICS-CERT, the Industrial Control Systems Cyber Emergency Response Team) for a while, we went out and reached out to the secretaries of state in different states and offered assistance.

There’s a lot of discussion right now as to how the states want to use the capabilities and best practices and the like, but I think that’s something that still needs to be at the top of the agenda at the state level as well as within the Department of Homeland Security to help.

Two, from an influence operations standpoint, we have to do counter influence operations, and I think we’ve already started a lot of that. We need to make sure that the American people understand that there are influence operations that are, in fact, being conducted against us, and the media has been really good as of late, for example, highlighting the fact that we had the major intelligence leaders testifying before Congress this past week, raising that alert.

The next step is for the federal government to actually have a plan on how to educate and inform citizens as to, “What do I need to do in an environment where influence operations are ongoing?” That’s going to be very difficult for the United States government to do given the fact that we cherish freedom of the press and our First Amendment, but we do need to make sure that we have an educated and informed populace.

The third thing that the federal government should be doing, in my opinion, is be[ing] very clear from a deterrence standpoint what the consequences would be for any entity that is trying to interfere with our free and open democratic processes. There should be accountability. There should be activity leveraging diplomatic and other instruments of national power to deter any entity from attacking our most cherished democratic institutions.