Serious quantum computers are finally here. What are we going to do with them? – MIT Technology Review

Inside a small laboratory in lush countryside about 50 miles north of New York City, an elaborate tangle of tubes and electronics dangles from the ceiling. This mess of equipment is a computer. Not just any computer, but one on the verge of passing what may go down as one of the most important milestones in the history of the field.

Quantum computers promise to run calculations far beyond the reach of any conventional supercomputer. They might revolutionize the discovery of new materials by making it possible to simulate the behavior of matter down to the atomic level. Or they could upend cryptography and security by cracking otherwise invincible codes. There is even hope they will supercharge artificial intelligence by crunching through data more efficiently.


Yet only now, after decades of gradual progress, are researchers finally close to building quantum computers powerful enough to do things that conventional computers cannot. It’s a landmark somewhat theatrically dubbed “quantum supremacy.” Google has been leading the charge toward this milestone, while Intel and Microsoft also have significant quantum efforts. And then there are well-funded startups including Rigetti Computing, IonQ, and Quantum Circuits.


This story is part of our March/April 2018 Issue

No other contender can match IBM’s pedigree in this area, though. Starting 50 years ago, the company produced advances in materials science that laid the foundations for the computer revolution. Which is why, last October, I found myself at IBM’s Thomas J. Watson Research Center to try to answer these questions: What, if anything, will a quantum computer be good for? And can a practical, reliable one even be built?

Why we think we need a quantum computer

The research center, located in Yorktown Heights, looks a bit like a flying saucer as imagined in 1961. It was designed by the neo-futurist architect Eero Saarinen and built during IBM’s heyday as a maker of large mainframe business machines. IBM was the world’s largest computer company, and within a decade of the research center’s construction it had become the world’s fifth-largest company of any kind, just behind Ford and General Electric.

While the hallways of the building look out onto the countryside, the design is such that none of the offices inside have any windows. It was in one of these cloistered rooms that I met Charles Bennett. Now in his 70s, he has large white sideburns, wears black socks with sandals, and even sports a pocket protector with pens in it. Surrounded by old computer monitors, chemistry models, and, curiously, a small disco ball, he recalled the birth of quantum computing as if it were yesterday.

Charles Bennett of IBM Research is one of the founding fathers of quantum information theory. His work at IBM helped create a theoretical foundation for quantum computing.

bartek sadowski

When Bennett joined IBM in 1972, quantum physics was already half a century old, but computing still relied on classical physics and the mathematical theory of information that Claude Shannon had developed at Bell Labs in the 1940s. It was Shannon who defined the quantity of information in terms of the number of “bits” (a term he popularized but did not coin) required to store it. Those bits, the 0s and 1s of binary code, are the basis of all conventional computing.

A year after arriving at Yorktown Heights, Bennett helped lay the foundation for a quantum information theory that would challenge all that. It relies on exploiting the peculiar behavior of objects at the atomic scale. At that size, a particle can exist “superposed” in many states (e.g., many different positions) at once. Two particles can also exhibit “entanglement,” so that changing the state of one may instantaneously affect the other.

Bennett and others realized that some kinds of computations that are exponentially time consuming, or even impossible, could be efficiently performed with the help of quantum phenomena. A quantum computer would store information in quantum bits, or qubits. Qubits can exist in superpositions of 1 and 0, and entanglement and a trick called interference can be used to find the solution to a computation over an exponentially large number of states. It’s annoyingly hard to compare quantum and classical computers, but roughly speaking, a quantum computer with just a few hundred qubits would be able to perform more calculations simultaneously than there are atoms in the known universe.
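A toy sketch (not IBM’s software, just an illustration) makes that scaling concrete: a classical simulator must track one complex amplitude per basis state, so writing down the state of n qubits takes 2^n numbers, and a Hadamard gate on every qubit puts equal weight on all of them at once.

```python
def uniform_superposition(n_qubits):
    """State vector after a Hadamard gate on each of n qubits:
    an equal superposition over all 2**n basis states."""
    dim = 2 ** n_qubits
    amplitude = dim ** -0.5          # each amplitude is 1/sqrt(2**n)
    return [amplitude] * dim

state = uniform_superposition(10)
print(len(state))                    # 1024 basis states from just 10 qubits
print(sum(a * a for a in state))     # squared amplitudes (probabilities) sum to 1
```

Each added qubit doubles the list, which is the exponential growth the paragraph describes.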

In the summer of 1981, IBM and MIT organized a landmark event called the First Conference on the Physics of Computation. It took place at Endicott House, a French-style mansion not far from the MIT campus.

In a photo that Bennett took during the conference, several of the most influential figures from the history of computing and quantum physics can be seen on the lawn, including Konrad Zuse, who developed the first programmable computer, and Richard Feynman, an important contributor to quantum theory. Feynman gave the conference’s keynote speech, in which he raised the idea of computing using quantum effects. “The biggest boost quantum information theory got was from Feynman,” Bennett told me. “He said, ‘Nature is quantum, goddamn it! So if we want to simulate it, we need a quantum computer.’”

IBM’s quantum computer—one of the most promising in existence—is located just down the hall from Bennett’s office. The machine is designed to create and manipulate the essential element in a quantum computer: the qubits that store information.

This lab at IBM houses quantum machines connected to the cloud.

jeremy liebman

The gap between the dream and the reality



The IBM machine exploits quantum phenomena that occur in superconducting materials. For instance, sometimes current will flow clockwise and counterclockwise at the same time. IBM’s computer uses superconducting circuits in which two distinct electromagnetic energy states make up a qubit.

The superconducting approach has key advantages. The hardware can be made using well-established manufacturing methods, and a conventional computer can be used to control the system. The qubits in a superconducting circuit are also easier to manipulate and less delicate than individual photons or ions.

Inside IBM’s quantum lab, engineers are working on a version of the computer with 50 qubits. You can run a simulation of a simple quantum computer on a normal computer, but at around 50 qubits it becomes nearly impossible. That means IBM is theoretically approaching the point where a quantum computer can solve problems a classical computer cannot: in other words, quantum supremacy.
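A back-of-the-envelope calculation shows why roughly 50 qubits is the crossover point. Assuming the usual 16 bytes for one double-precision complex amplitude (an assumption; real simulators use many compression tricks), the memory a brute-force simulation needs grows exponentially with qubit count:

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory a brute-force classical simulator needs to hold the
    full state vector: one complex amplitude per basis state."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**50:.5g} PiB")
# At 50 qubits the state vector alone is 16 PiB (2**54 bytes),
# far beyond the RAM of any single machine.
```

Thirty qubits fit in a laptop’s memory; fifty do not fit anywhere, which is what makes the milestone meaningful.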

But as IBM’s researchers will tell you, quantum supremacy is an elusive concept. You would need all 50 qubits to work perfectly, when in reality quantum computers are beset by errors that need to be corrected for. It is also devilishly difficult to maintain qubits for any length of time; they tend to “decohere,” or lose their delicate quantum nature, much as a smoke ring breaks up at the slightest air current. And the more qubits, the harder both challenges become.

“If you had 50 or 100 qubits and they really worked well enough, and were fully error-corrected—you could do unfathomable calculations that can’t be replicated on any classical machine, now or ever,” says Robert Schoelkopf, a Yale professor and founder of a company called Quantum Circuits. “The flip side to quantum computing is that there are exponential ways for it to go wrong.”

The chips inside IBM’s quantum computer (at bottom) are cooled to 15 millikelvin.

jeremy liebman

Another reason for caution is that it isn’t obvious how useful even a perfectly functioning quantum computer would be. It doesn’t simply speed up any task you throw at it; in fact, for many calculations, it would actually be slower than classical machines. Only a handful of algorithms have so far been devised where a quantum computer would clearly have an edge. And even for those, that edge might be short-lived. The most famous quantum algorithm, developed by Peter Shor at MIT, is for finding the prime factors of an integer. Many common cryptographic schemes rely on the fact that this is hard for a conventional computer to do. But cryptography could adapt, creating new kinds of codes that don’t rely on factorization.
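For context, here is the classical baseline that Shor’s algorithm beats: plain trial division, whose work grows with the square root of n, i.e., exponentially in the number of digits. This sketch is not Shor’s algorithm itself (that requires a quantum computer and runs in polynomial time); it just shows why factoring large numbers is slow on conventional hardware.

```python
def trial_factor(n):
    """Classical trial division: return the smallest factor pair of n.
    The loop runs up to sqrt(n) times, so doubling the digit count
    squares the work -- the hardness RSA-style cryptography rests on."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(trial_factor(3127))  # (53, 59)
```

For a few digits this is instant; for the 600-digit numbers used in real cryptography, the same loop would outlast the universe.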

This is why, even as they near the 50-qubit milestone, IBM’s own researchers are keen to dispel the hype around it. At a table in the hallway that looks out onto the lush lawn outside, I encountered Jay Gambetta, a tall, easygoing Australian who researches quantum algorithms and potential applications for IBM’s hardware. “We’re at this unique stage,” he said, choosing his words with care. “We have this device that is more complicated than you can simulate on a classical computer, but it’s not yet controllable to the precision that you could do the algorithms you know how to do.”

What gives the IBMers hope is that even an imperfect quantum computer might still be a useful one.

Gambetta and other researchers have zeroed in on an application that Feynman envisioned back in 1981. Chemical reactions and the properties of materials are determined by the interactions between atoms and molecules. Those interactions are governed by quantum phenomena. A quantum computer can—at least in theory—model those in a way a conventional one cannot.

Last year, Gambetta and colleagues at IBM used a seven-qubit machine to simulate the precise structure of beryllium hydride. At just three atoms, it is the most complex molecule ever modeled with a quantum system. Ultimately, researchers might use quantum computers to design more efficient solar cells, more effective drugs, or catalysts that turn sunlight into clean fuels.

Those goals are a long way off. But, Gambetta says, it may be possible to get valuable results from an error-prone quantum machine paired with a classical computer.
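Concretely, “simulating a molecule” means estimating its ground-state energy: the lowest eigenvalue of the molecule’s Hamiltonian. In the hybrid scheme Gambetta describes, a classical optimizer proposes trial states while the quantum chip estimates their energies. The sketch below is purely classical and uses a made-up two-level Hamiltonian (not real beryllium-hydride data); it only shows the quantity such a computation is after.

```python
import math

def ground_state_energy(h11, h22, h12):
    """Lowest eigenvalue of the real symmetric 2x2 Hamiltonian
    [[h11, h12], [h12, h22]] -- the 'ground-state energy' a
    molecular simulation targets (a closed form exists for 2x2)."""
    mean = (h11 + h22) / 2
    gap = math.hypot((h11 - h22) / 2, h12)
    return mean - gap

# Hypothetical two-level system; a real molecule's Hamiltonian grows
# exponentially with the number of electrons, which is where a
# quantum computer is supposed to help.
print(ground_state_energy(-1.0, 0.5, 0.3))
```

For two levels a pocket calculator suffices; the exponential blow-up for larger molecules is exactly the problem Feynman had in mind.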

From a physicist’s dream to an engineer’s nightmare



“The thing driving the hype is the realization that quantum computing is actually real,” says Isaac Chuang, a lean, soft-spoken MIT professor. “It is no longer a physicist’s dream—it is an engineer’s nightmare.”

Chuang led the development of some of the earliest quantum computers, working at IBM in Almaden, California, during the late 1990s and early 2000s. Though he is no longer working on them, he thinks we are at the beginning of something very big—that quantum computing will eventually even play a role in artificial intelligence.

But he also suspects that the revolution will not really begin until a new generation of students and hackers get to play with practical machines. Quantum computers require not just different programming languages but a fundamentally different way of thinking about what programming is. As Gambetta puts it: “We don’t really know what the equivalent of ‘Hello, world’ is on a quantum computer.”

We are beginning to find out. In 2016 IBM connected a small quantum computer to the cloud. Using a programming tool kit called QISKit, you can run simple programs on it; thousands of people, from academic researchers to schoolkids, have built QISKit programs that run basic quantum algorithms. Now Google and other companies are also putting their nascent quantum computers online. You can’t do much with them, but at least they give people outside the leading labs a taste of what may be coming.
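The closest thing to a quantum “Hello, world” on these services is preparing a Bell pair: two entangled qubits. The sketch below is not the QISKit API; it hand-simulates, in plain Python, what that two-gate circuit (a Hadamard, then a CNOT) does to the four-amplitude state vector.

```python
import math

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]          # start in |00>

# Hadamard on qubit 0 (the left bit): mixes |0x> with |1x>
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]),
         h * (state[1] + state[3]),
         h * (state[0] - state[2]),
         h * (state[1] - state[3])]

# CNOT (control qubit 0, target qubit 1): swaps |10> <-> |11>
state[2], state[3] = state[3], state[2]

print(state)  # Bell state: amplitude 1/sqrt(2) on |00> and |11> only
```

Measuring either qubit now yields 0 or 1 at random, but both qubits always agree: the entanglement Bennett’s theory is built on, in ten lines.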

The startup community is also getting excited. A short while after seeing IBM’s quantum computer, I went to the University of Toronto’s business school to sit in on a pitch competition for quantum startups. Teams of entrepreneurs nervously got up and presented their ideas to a group of professors and investors. One company hoped to use quantum computers to model the financial markets. Another planned to have them design new proteins. Yet another wanted to build more advanced AI systems. What went unacknowledged in the room was that each team was proposing a business built on a technology so revolutionary that it barely exists. Few seemed daunted by that fact.

This enthusiasm could sour if the first quantum computers are slow to find a practical use. The best guess from those who truly know the difficulties—people like Bennett and Chuang—is that the first useful machines are still several years away. And that’s assuming the problem of managing and manipulating a large collection of qubits won’t ultimately prove intractable.

Still, the experts hold out hope. When I asked him what the world might be like when my two-year-old son grows up, Chuang, who learned to use computers by playing with microchips, responded with a grin. “Maybe your kid will have a kit for building a quantum computer,” he said.


DOST: Advancing science, technology agenda best option for PHL growth – Business Mirror

Last updated on February 21st, 2018 at 09:10 pm

IN a recent meeting with members of the Makati Business Club and several foreign chambers of commerce, the government’s chief scientist Fortunato dela Peña encouraged local and foreign businessmen to invest in technology-related enterprises. Dela Peña, Secretary of the Department of Science and Technology (DOST), said this is relevant as the government is now investing heavily in science and technology.

The chief scientist cited advances in health and medicine, with the country’s numerous traditional medicinal herbs as a focus, as well as in education, energy, disaster resiliency and climate-change adaptation, along with creative enterprises such as design.

A civil servant for two decades, holding various teaching and government positions before his appointment as head of the DOST under the present administration, Dela Peña said that for a time the country seemed to have grown “resistant” to science- and technology-related endeavors, although a core of advocates persisted in pushing the science agenda.

The progressive minds seemed to have prevailed, the DOST official said.

Recently, the Philippines ranked 73rd out of 128 economies on a science, technology and innovation (STI) index, with the report citing the country’s strength in research and in the commercialization of STI ideas. The report also said that 60 percent of companies in the country offer training to improve the technical skills of their employees.

Investment

HOWEVER, a study by the Philippine Institute for Development Studies highlighted the weak ties between innovation-driven firms and the government, and it also identified the country’s low expenditure in research and development (R&D).

According to Dela Peña, this is the reason the government is now extending all its efforts to reach out to the private sector, explaining that STI plays an important role in economic and social progress and is a key driver of long-term economic growth.

Technology adoption, the official said, allows a country’s firms and citizens to benefit from innovations created in other countries, letting it catch up and even leapfrog obsolete technologies.

This can lead to significant improvements in productivity for existing firms in agriculture, industry and services.

For one, long-term investments in building local capacity for technology generation can lead to innovations that will give local firms a competitive advantage. This can result in the creation of new firms and even entirely new industries.

For another, the local medicine sector has been showing potential, the DOST official said, citing the case of two dozen local herbs or medicinal plants being studied as one example.

Lagundi

WHEN asked about the case of Lagundi (Vitex negundo), whose efficacy as medicine is being challenged by drug manufacturers, DOST Assistant Secretary for International Cooperation Leah J. Buendia said the shrub was subjected to 20 years of stringent clinical trials and consistently proven effective.

“But since the DOST, and the government for that matter, is not into commercialization, private companies are the ones who manufacture the component of the medicinal plant into commercially available medicines,” Buendia said, adding that the agency only gives the results which include the right formula and volume of the medicinal component.

Asked about the possibility that private manufacturers might knowingly dilute the strength required for the medicine to be effective in order to cut costs, thereby rendering the commercially available medicine ineffective, the official declined to comment.

She assured, however, that the private sector is working with the agency to commercialize discoveries made and studied by the DOST in partnership with the private sector, as these “products would not help private companies profit but advance the country’s agenda.”

That science agenda, despite advances, still needs prioritization and more funding. It is laid out in the Philippine Development Plan 2017-2022, which devotes an entire chapter to STI.

STI culture

RECENT positive developments and advances in science and technology notwithstanding, the level of innovation in the country remains low. This is brought about by weaknesses in STI human capital, low R&D expenditure and weak linkages in the STI ecosystem.

In the Global Innovation Index (GII) Report last year, the Philippines ranked 74th among 128 economies in terms of overall innovation, garnering a score of 31.8 out of 100. This is a slight improvement from the score of 31.1, ranking 83rd out of 141 economies in 2015.

The country also ranked fifth out of seven members of the Association of Southeast Asian Nations (Asean) that were included in the survey. The Philippines was ahead of Cambodia (95th) and Indonesia (88th) but lagged behind Singapore (6th), Malaysia (35th), Thailand (52nd) and Vietnam (59th).

The factors behind the weak performance of the STI sector include a weak STI culture, Dela Peña said.

There is a lack of public awareness of and interest in STI, and many sectors do not recognize, appreciate or understand the use of technology- and science-based information in their daily activities.

There are also weaknesses in social and professional culture, such as the research culture in universities and the commercialization of results from public research. According to Dela Peña, a lack of awareness of intellectual-property rights, both in the research community and among the general public, still persists.

Despite the availability of technologies, their adoption and application among micro, small and medium enterprises (MSMEs) and in sectors like agriculture and fisheries remain low, he added.

Research

LOW government and private spending on STI is another factor behind the weak performance of the STI sector, according to the GII report.

Investments in R&D are central for enhancing the country’s innovation ecosystem, the report said. Expenditures on R&D and innovation activities, as well as the support given to the development of human resources in various fields of science and technology (S&T), are the parameters scrutinized in the monitoring and evaluation of STI.

While nominal R&D expenditures increased by 80 percent to P15.92 billion in 2013, the proportion of R&D spending to GDP stood at only 0.14 percent. This is substantially below the 1-percent benchmark recommended by the United Nations Educational, Scientific and Cultural Organization (Unesco) and the global average of 2.04 percent. It is also low compared with other Asean countries, such as Vietnam with 0.19 percent, Thailand with 0.36 percent, Malaysia with 1.09 percent and Singapore with 2.0 percent. The data are available online from Unesco’s Institute for Statistics.

The country’s relatively low ranking in the GII Report was pulled down by weaknesses in human capital and R&D, with a score of 22.7 out of 100, ranking 95th. This is due to the low public and private expenditures on education and R&D, as well as low tertiary inbound mobility. Tertiary inbound mobility refers to the number of students from abroad studying in a given country, as a percentage of the total tertiary or college enrollment.

The bulk of the R&D spending, about 60 percent, comes from the public sector. These were directed to agricultural and industrial production and technology, protection and improvement of human health, control and care of the environment, among others. Most of the R&D activities in the country are still concentrated in the National Capital Region, Calabarzon and Central Luzon.

Manpower

ANOTHER indicator measuring the capacity for technology generation is the number of S&T human resources engaged in R&D.

As of 2013, the country had a total of 36,517 R&D personnel, of whom 26,495 were key researchers and scientific, technological and engineering personnel engaged in R&D; the rest were technicians and support personnel.

The figures denote that there are only 270 researchers for every one million Filipinos. Such a ratio falls short of the Unesco norm of 380 per million and the average of 1,020 researchers per million across the developing economies of East Asia and the Pacific.

Of the total researchers in the country from the government, higher educational institutions (HEIs) and private nonprofit sectors, 14 percent had doctoral degrees (PhD), 38 percent had master’s degrees, while 34 percent had Bachelor of Science (BS) up to post-BS degrees. The low number of researchers in the country reflects the propensity of the educational system in the country to produce graduates outside of science, technology, engineering and mathematics, or Stem, programs—the disciplines where R&D flourishes. Nevertheless, the latest GII report indicates that in terms of graduates in science and engineering, the country garnered a score of 25.5 out of 100, ranking 26th.

Capital

AN assessment of the country’s innovation system conducted by a program of the United States Agency for International Development (Usaid) reveals that the supply of Stem graduates exceeds local demand.

As a result, there is out-migration and, worse, underemployment of many skilled, locally trained scientists and engineers. The report by Usaid’s Science, Technology, Research and Innovation for Development (Stride) program also cited a shortage of training in fields critical for innovation, particularly information technology. Such a shortage contributes to the challenge many local companies face in securing employees with the skills required to grow their businesses.

This partly explains the country’s brain drain: it is not so much that Filipinos are not “nationalistic,” but simply that there is limited opportunity for people of science to stay in the country.

However, Buendia said the issue of nationalism has some credence, if not the absolute answer, citing the case of South Korea in the 1950s.

When South Korea was at its economic lowest, the government called on its scientists and engineers scattered around the world to go home and help build the economy, and many responded, she said.

According to Buendia, this brain drain compounds the problem, as potential researchers, scientists and engineers, the key actors needed for the innovation ecosystem to flourish, prefer to seek employment overseas due to better economic opportunities and potential for advancement. Since knowledge and technology are mostly embodied in human resources, this underscores the urgency of accelerating the development of the country’s R&D workforce.

Patents

THE output of R&D is commonly measured in terms of patents applied for and granted to Filipino residents.

However, reports show that many universities do not have the expertise to market their patent portfolios for commercial use. Furthermore, technology generators face persisting issues on technology ownership while researchers are constrained by the “publish or perish” phenomenon.

This results in a weak technology-transfer system in the country.

An annual average of 209 patent utility models and 597 industrial design applications were filed from 2005 to 2015. In the same period, an annual average of 54 patents, 446 utility models and 502 industrial designs were granted.

In 2016, the World Economic Forum (WEF) ranked the Philippines 86th out of 128 economies for the number of patents filed under the Patent Cooperation Treaty per million population. Invention patents granted to local inventors represent the smallest share in number of intellectual properties granted from 2001 to 2003. Industrial design and utility models consistently comprise the majority of the intellectual property granted.

The country also needs to catch up in research publications: the number of scientific publications in peer-reviewed journals per million population stands at 55, substantially below that of Asean member-states such as Singapore (a staggering 10,368), Malaysia (1,484), Thailand (478) and Vietnam (105).

Ecosystem

ANOTHER factor behind the weak performance of the STI sector is the weak linkages among players in the STI ecosystem.

The 2009 Survey of Innovation Activities and the 2014 Usaid-Stride Assessment of the Philippine Innovation Ecosystem found weak cooperation, partnerships and trust among innovation actors. Most HEIs perceive collaboration with companies as outside their core missions and a risk of exploitation.

Consequently, firms report that difficulties in convincing HEIs of their shared interests stem from resentment, suspicion and distrust. In effect, firms end up with little technical assistance from the government and research institutions.

Another factor in this equation is restrictive regulations that hamper implementation of R&D programs and projects.

The tedious government procurement process hobbles the immediate procurement of equipment and other needed materials for research, which, in turn, delays the implementation of R&D projects, the GII report said. This was confirmed by the Usaid-Stride study, which revealed that restrictive regulations make the procurement of equipment and consumables for research extremely slow and unnecessarily complex, decreasing research productivity, publication potential, and speed-to-market of innovation.

In addition, the report said government research grants do not compensate universities for faculty members’ time spent on research activities, a practice rarely seen outside the Philippines.

The final factor in the weak performance of the STI sector is inadequacy of an STI infrastructure that includes laboratory facilities, testing facilities and R&D centers.

Many existing hubs need upgrading to improve their services, which contributes to the lack of absorptive capacity in research institutions, the Usaid-Stride report said. It also noted that public institutions fail to provide young researchers with equipment packages, particularly those returning from PhD studies abroad with more advanced research agendas.

The country’s leading research institutions also remain concentrated in Luzon.

Hopes

DESPITE the many inadequacies, from funding to human capital, some technology-intensive research and capacity-building projects have resulted in products that are now being used successfully.

One is the micro-satellite.

In April 2016, the country launched into space its first micro-satellite, called Diwata-1. It was designed, developed and assembled by Filipino researchers and engineers under the guidance of Japanese experts. The Diwata satellite (the name is Filipino for “deity”) provides real-time, high-resolution, multicolor infrared images for various applications, including meteorological imaging, crop and ocean productivity measurement and high-resolution imaging of natural and man-made features.

It enables a more precise estimate of the country’s agricultural production, provides images of watersheds and floodplains for a better understanding of water available for irrigation, power and domestic consumption. The satellite also provides accurate information on any disturbance and degradation of forest and upland areas.

The country also has the Nationwide Operational Assessment of Hazards (Noah), initiated in June 2012 to help manage the risks associated with natural hazards and disasters. The project developed hydromet sensors and high-resolution geo-hazard maps, generated using lidar (light detection and ranging) technology, for flood modeling.

Noah helps the government in providing timely warning with a lead time of at least six hours in the wake of impending floods.

According to Buendia, the country is now training Cambodians in this technology as part of partnerships among Asean countries, much as Japan assisted Filipino scientists and engineers in building the first micro-satellite.

Another hope lies in the so-called Intelligent Operation Center Platform.

Established through a collaboration between the local government of Davao City and IBM Philippines Inc., the center resulted in the creation of a dashboard that allows authorized government agencies, such as police, fire and anti-terrorism task force, to use analytics software for monitoring events and operations in real time.

Initiatives

THE DOST, in cooperation with HEIs and research institutions, established advanced facilities that seek to spur R&D activities and provide MSMEs access to testing services needed to increase their productivity and competitive advantage.

One is the Advanced Device and Materials Testing Laboratories. The center houses advanced equipment for failure analysis and materials characterization to address advanced analytical needs for quality control, materials identification and R&D. Closely related to this facility is the Electronics Products Development Center, used to design, develop and test hardware and software for electronic products.

There are also high-performance computing facilities that perform tests and run computationally intensive applications for numerical weather prediction, climate modeling, as well as analytics and data modeling and archiving.

The Philippines could also boast of its Genome Center, a core facility that combines basic and applied research for the development of health diagnostics, therapeutics, DNA forensics and preventive products, and improved crop varieties.

According to Buendia, the country also has drug-discovery facilities, which address the requirements for producing high-quality and globally acceptable drug candidates. She said the Philippines also has nanotechnology centers, which provide technical services and enabling environment for interdisciplinary and collaborative R&D in various nanotechnology applications.

Buendia said there are also radiation processing facilities that are used to degrade, graft, or crosslink polymers, monomers, or chemical compounds for industrial, agricultural, environmental and medical applications. The Philippines could also boast of its Die and Mold Solutions Center, which enhances the competitiveness of the local tool and die sector through the localization of currently imported dies and molds.

These reflect that the country is advancing, albeit slowly, toward a culture that embraces STI as a sure path to growth, according to Dela Peña.


Alladin S. Diega

Alladin S. Diega has been working with BusinessMirror as a correspondent since 2013. Mr. Diega currently covers the Metro Page. He studied BS Journalism at the Lyceum of the Philippines and has also worked with various non-government organizations. He is currently into breeding African Lovebirds as a hobby.

30 Incredibly Useful Things You Didn’t Know Slack Could Do – Fast Company

4. Save your place in a channel or direct message by holding down the Alt or Option key and then clicking any timestamp (or long-pressing on a timestamp from the mobile app and selecting “Mark unread”). The next time you open that thread on any device, you’ll be taken directly to the point you set.

5. Get where you need to go faster by letting Slack’s Quick Switcher beam you directly to any channel, direct message, or team—no clicking or navigation required. Just hit Ctrl-K or Cmd-K and type the first letter of your desired destination. You’ll see a list of options appear and can select the one you want or type additional letters to narrow it down further. (Bonus tip: If you’re using one of the Slack desktop apps, Ctrl-T or Cmd-T will do the same thing. Take your pick!)

6. Make it easy to find important messages or files by pinning them to a channel or private message. Click the three-dot menu icon above a message (or long-press it from the mobile app) and select “Pin to this conversation.” The item will then appear under a pushpin icon at the top of the thread, where everyone will see and be able to access it as needed.

Put a pin in something important so everyone can find it with ease.

7. Catch up on activity quickly by putting Slack’s All Unreads feature to use. First, make sure the feature is activated within the Sidebar section of the desktop app’s preferences. Then just look for the “All Unreads” line in the sidebar, and start there to see a single centralized list of everything you haven’t read.

8. When a channel gets too cluttered for you to focus, let Slack hide all the inline image previews—including, yes, any animated GIFs your Giphy-loving colleagues have summoned. Type “/collapse” to send ’em all a-packin’, then type “/expand” if and when you want everything back.

Get Your Message Across

9. You may know about Slack’s basic text-formatting commands (asterisks around text for bold, underscores for italics), but the app also has advanced options that’ll let you further transform the appearance of your words. To wit:

The Case Against Google – The New York Times

Standard Oil’s technological discoveries gave the company huge advantages over its rivals, and Rockefeller exploited those advantages ruthlessly. He cut secret deals with railroads so that other firms had to pay more for transportation. He forced smaller refineries to choose between selling out to him or facing bankruptcy. “Rockefeller and his associates did not build the Standard Oil Co. in the boardrooms of Wall Street,” wrote Ida Tarbell, a muckraking journalist of the day. “They fought their way to control by rebate and drawback, bribe and blackmail, espionage and price cutting, and perhaps more important, by ruthless, never slothful efficiency of organization.”

In 1906, President Theodore Roosevelt ordered his Justice Department to sue Standard Oil for antitrust violations. But government lawyers faced a quandary: It wasn’t illegal for Standard Oil to be a monopoly. It wasn’t even illegal to compete mercilessly. So government prosecutors found a new argument: If a firm is more powerful than everyone else, they said, it can’t simply act like everyone else. Instead, it has to live by a special set of rules, so that other companies get a fair shot. “The theory was that competition is good, and if a monopoly extinguishes competition, that’s bad,” says Herbert Hovenkamp, co-author of a seminal treatise on antitrust law. “Once you become a monopoly, you have to start acting differently, and if you don’t, then what you’ve been doing all along starts breaking the law.”

The Supreme Court agreed and split Standard Oil into 34 firms. (Rockefeller received stock in all of them and became even wealthier.) In the decades following the Standard Oil breakup, antitrust enforcement generally abided by a core principle: When a company grows so powerful that it becomes a gatekeeper, and uses that might to undermine competitors, then the government should intervene. And in the last century, as courts have censured other monopolies, academics and jurists have noticed a pattern: Monopolies and technology often seem intertwined. When a company discovers a technological advantage — like the innovations of Rockefeller’s scientists — it sometimes makes that firm so powerful that it becomes a monopoly almost without trying very hard. Many of the most important antitrust lawsuits in American history — against IBM, Alcoa, Kodak and others — were rooted in claims that one company had made technological discoveries that allowed it to outpace competitors.

For decades, there seemed to be a consensus among policymakers and business leaders (though not always among targeted companies) about how the antitrust laws should be enforced. But around the turn of this century, a number of tech companies emerged that caused some people to question whether the antitrust formula made sense anymore. Firms like Google and Facebook have become increasingly useful as they have grown bigger and bigger — a characteristic known as network effects. What’s more, some have argued that the online world is so fast-moving that no antitrust lawsuit can keep pace. Nowadays even the biggest titan can be defeated by a tiny start-up, as long as the newcomer has better ideas or faster tech. Antitrust laws, digital executives said, aren’t needed anymore.

Consider Microsoft. The government spent most of the 1990s suing Microsoft for antitrust violations, a prosecution that many now view as a complete waste of time and money. When Microsoft’s chief executive, Bill Gates, signed a consent decree to resolve one of its monopoly investigations in 1994, he told a reporter that it was essentially pointless for the company’s various divisions: “None of the people who run those divisions are going to change what they do or think.” Even after a federal judge ordered Microsoft broken into separate companies in 2000, the punishment didn’t take. Microsoft fought the ruling and won on appeal. The government then offered a settlement so feeble that nine states begged the court to reject the proposal. It was approved.

What eventually humbled Bill Gates and ended Microsoft’s monopoly wasn’t antitrust prosecutions, observers say, but a more nimble start-up named Google, a search engine designed by two Stanford Ph.D. dropouts that outperformed Microsoft’s own forays into search (first MSN Search and now Bing). Then those two dropouts introduced a series of applications, like Google Docs and Google Sheets, that eventually began to compete with almost every aspect of Microsoft’s businesses. And Google did all that not by relying on government prosecutors but by being smarter. You don’t need antitrust in the digital marketplace, critics argue. “When our products don’t work or we make mistakes, it’s easy for users to go elsewhere because our competition is only a click away,” Google’s co-founder, Larry Page, said in 2012. Translation: The government ought to stop worrying, because no online giant will ever survive any longer than it deserves to.

Once Foundem.com was available to everyone, the company’s honeymoon lasted precisely two days. During its first 48 hours, the Raffs saw a rush of traffic from users typing product queries into Google and other search engines. But then, suddenly, the traffic stopped. Alarmed, Adam and Shivaun began running diagnostics. They quickly discovered that their site, which until then had been appearing near the top of search results, was now languishing on Google, mired 12 or 15 or 64 or 170 pages down. On other search engines, like MSN Search and Yahoo, Foundem still ranked high. But on Google, Foundem had effectively disappeared. And Google, of course, was where a vast majority of people searched online.

As Amazon competition heats up, D.C. mayor heads west to talk tech – The Washington Post


Innovator Awards Program 2018: Semifinalists | Healthcare Informatics Magazine – Healthcare Informatics

Editor’s Note: We at Healthcare Informatics were once again ecstatic about the exceptional quality of the submissions we received from innovative patient care organizations across the U.S. In addition to the four winning teams this year (whose stories will be posted throughout this week), our editorial team also selected several runners-up. Below, please find descriptions of the initiatives of the 14 teams to whom we have awarded semifinalist status in this year’s program.

Centre for Addiction and Mental Health (Ontario, Canada)

Improving patient care through achievement of HIMSS EMRAM Stage 7

In May 2014, CAMH implemented a clinical information system using a big-bang approach with an integrated team of clinicians, information technology, and other staff. But after implementation, CAMH noted a lack of clinical practice standardization. A new initiative emerged that included work to refine the inputting of clinical documentation to the EHR, the development of electronic whiteboards to display and manage assessment and risk factors, leveraging data to inform improvement initiatives, and many other requirements as defined by the HIMSS EMRAM (EMR Adoption Model) Stage 7 criteria.

CAMH is the first academic teaching hospital to achieve HIMSS Stage 7 in Canada; this achievement is a milestone in both the Canadian and international health landscape. Now, more than 99 percent of CAMH clinically relevant documentation is completed directly within the EHR, and CPOE (computerized provider order entry) rates have been over 90 percent since December 2016. What’s more, the creation of a suicide risk dashboard has led to 90 percent of patients having a suicide risk assessment completed within 24 hours of admission.

Cleveland Clinic (Cleveland, Ohio)

An enterprise imaging service

The goal of the enterprise imaging service is to provide a comprehensive longitudinal medical record through incorporation of all medical images into a single archive. Through a universal viewer, the archive is integrated with the EHR and provides a foundation for image distribution to all caregivers throughout the enterprise. The archive also serves as a foundation for image sharing. Implementation required a comprehensive assessment of all image generating equipment throughout all hospitals and outpatient centers.

The Clinic’s officials say that the establishment of an enterprise imaging program has led to the consolidation of imaging archives throughout the health system. Images which were not previously easily accessible are now readily viewable through the EHR (electronic health record) with access points both within the firewall and from home. To date, 11 different service lines and more than 440 pieces of image generating equipment outside of radiology have been integrated.

Compass Medical (Massachusetts)

Annual wellness, chronic care management and quality outcomes

By leveraging new information technology, Compass Medical has been able to follow proven population health management and care management principles, allowing patient care leaders to identify and target specific population groups, stratify and prioritize care gaps, and engage and individualize care plan activities. In 2016, for example, Compass Medical identified more than 14,000 Medicare patients who were struggling to manage their chronic health conditions and needed a more personalized and comprehensive care plan. One year later, Compass Medical developed and launched a new Chronic Care Management Program to help engage and closely manage Medicare patients who suffer from two or more chronic health conditions. With the help of its EHR and big-data platform, Compass Medical positioned itself to automate many of the workflows for care management nurses.

The Annual Wellness Visit (AWV) is another example of a preventive care service that has been positively affected by leveraging IT. In 2017, national trends suggested AWV utilization was still hovering in the low 20-percent range, with the highest-performing state reaching 35 percent. Utilizing EHR-based patient engagement campaigns to increase focused outreach, incorporating a team-based care model with scribes, and creating standard work processes to reduce provider burden helped Compass Medical reach 57 percent AWV utilization for its Medicare-eligible population by the end of 2017.

Duke University School of Medicine (Durham, N.C.)

A NICU discrete event simulation model

Duke’s neonatal clinicians care for more than 800 babies each year in the Duke Neonatal Intensive Care Unit (NICU). Although the majority do well, about 40 babies do not survive. How could they improve outcomes and save lives? Duke’s neonatal research team partnered with analytics company SAS to create an analytics-based model of Duke Children’s Hospital’s Level IV neonatal intensive care unit. The result was a discrete event simulation model that closely resembled the clinical outcomes of Duke’s training unit; validated against data held back from model building, it also closely tracked actual unit outcomes.

The model uses a vast resource of clinical data to simulate the experience of patients, their conditions and staff responses in a computerized environment. It creates virtual babies experiencing care within a simulated NICU environment, including virtual beds staffed by virtual nurses. The research team attests that they cannot find any evidence of discrete event simulation modeling being used in a NICU setting, making this a first in neonatal care.
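The article does not describe the internals of Duke’s model, and its actual parameters are not public. Purely as an illustrative sketch of what discrete event simulation means here, a toy NICU model (with hypothetical arrival, stay, and capacity numbers, not Duke’s) can be built on an event queue using only the Python standard library:

```python
import heapq
import random

def simulate_nicu(n_beds=10, arrival_mean=12.0, stay_mean=120.0,
                  horizon=24 * 365, seed=1):
    """Toy NICU discrete event simulation (time in hours).

    Events are (time, kind) pairs on a min-heap, processed in time order.
    Babies arriving when no bed is free are counted as diverted.
    All parameters are illustrative, not taken from the article.
    """
    rng = random.Random(seed)
    # Seed the queue with the first arrival.
    events = [(rng.expovariate(1.0 / arrival_mean), "arrival")]
    occupied = admitted = diverted = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            # Schedule the next arrival, then try to admit this baby.
            heapq.heappush(
                events, (t + rng.expovariate(1.0 / arrival_mean), "arrival"))
            if occupied < n_beds:
                occupied += 1
                admitted += 1
                heapq.heappush(
                    events, (t + rng.expovariate(1.0 / stay_mean), "discharge"))
            else:
                diverted += 1
        else:  # a discharge frees a bed
            occupied -= 1
    return admitted, diverted

admitted, diverted = simulate_nicu()
```

A real model like Duke’s would replace the exponential draws with distributions fitted to clinical data and track outcomes per virtual baby, but the event-queue skeleton is the same.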

Houston Methodist (Houston, Texas)

A coordinated care/Medicare Shared Savings Program (MSSP) initiative

Houston Methodist’s MSSP program, Houston Methodist Coordinated Care (HMCC), can track and report Medicare patients’ healthcare visits and medical details. The successful execution of the program is a layering of technologies with the foundation being the organization’s integrated EHR platform and a separate population management tool.

The project was centered around six core elements: 1) becoming the first ACO (accountable care organization) in Texas to acquire real-time admission, discharge and transfer (ADT) notification capability that links all health providers; 2) chronic heart failure home monitoring; 3) real-time notification when HMCC patients came into the ED; 4) risk assessments for emergency room visits, hospital readmissions and the need for complex care; 5) same-day appointment facilitation; and 6) care team alerts. In sum, there were 17,000 Houston Methodist patients in HMCC in 2017, year-to-date, with 105 participating physicians. Total healthcare cost savings year-to-date are more than $1.3 million, according to officials.

Indiana University Health (Indianapolis, Ind.)

FHIR HIEdrant: making big data actionable at the point of care

One of the difficult challenges for many HIEs (health information exchanges) is the time and effort it takes to reach out to a second system to search for needed data at the point of care. As such, the goal at IU Health was to develop an application within the clinical workflow that, at the click of a single button, brings data relating to the patient’s chief complaint back from the HIE into that workflow.

The first phase of this project was building the framework and mechanisms to make this possible and applying them to a single context: an emergency department patient with chest pain. Leaders at IU Health are utilizing the Fast Healthcare Interoperability Resources (FHIR) standard to communicate from the IU Health Cerner EHR out to the HIE and retrieve five specific data elements that are germane to caring for a chest pain patient in the emergency department and understanding their risk. Within the workflow, the clinician is presented with the most recent ECG, cardiology note, discharge summary, catheterization report, and more. According to IU Health officials, this is the first FHIR-based application that directly accesses an HIE and delivers context-specific data about a patient directly to the clinical workflow.
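IU Health’s application code is not shown in the article. As a hedged sketch of what a FHIR retrieval of a “most recent” document looks like in general, the snippet below builds a standard FHIR R4 search URL and pulls the first resource out of a searchset Bundle. The endpoint, patient ID, and code are placeholders, and the real application would authenticate and handle errors:

```python
import json
from urllib.parse import urlencode

FHIR_BASE = "https://hie.example.org/fhir"  # placeholder HIE endpoint

def most_recent_report_query(patient_id, code):
    """Build a FHIR R4 search URL for the newest DiagnosticReport
    matching a code for one patient. `_sort=-date&_count=1` asks the
    server to return only the most recent match."""
    params = urlencode({
        "patient": patient_id,
        "code": code,
        "_sort": "-date",
        "_count": 1,
    })
    return f"{FHIR_BASE}/DiagnosticReport?{params}"

def first_resource(bundle_json):
    """Pull the first resource out of a FHIR searchset Bundle, if any."""
    bundle = json.loads(bundle_json)
    entries = bundle.get("entry", [])
    return entries[0]["resource"] if entries else None

# Hypothetical patient ID and code, for illustration only.
url = most_recent_report_query("12345", "11524-6")
```

The same pattern, with different resource types and codes, would cover each of the five data elements the article mentions.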

Johns Hopkins Health System (Baltimore, Md.)

inHealth precision medicine initiative

The precision medicine initiative at Johns Hopkins Medicine and University, inHealth, seeks to improve individual and population health outcomes through scientific advances at the intersection of biomedical research and data science. Through a collaboration between the Johns Hopkins Applied Physics Laboratory (APL) and Johns Hopkins Medicine (JHM), inHealth is building a big-data precision medicine platform with the goal of accelerating the translation of insight into care delivery.

The first result of this broad, multidisciplinary effort was the successful creation of two Precision Medicine Centers of Excellence (PMCoE) focused on multiple sclerosis and prostate active surveillance. The organization’s Technology Innovation Center has developed applications to garner new data and learnings from clinical practice and feedback into discovery. Physicians have begun using the discovery platform to facilitate conversations with their patients about their treatment options and risks. The experiences of these centers will lead the next wave of PMCoEs, expanding the utility of the platform.

Lakeland Health (St. Joseph, Mich.)

Something wicked this way comes

Leaders at Lakeland Health set three core cybersecurity goals: (a) put risk management and cybersecurity near the top of health system leadership agenda; (b) use innovative strategies and tools to execute the cybersecurity program; and (c) shift focus from fear to clinical integrity. The cybersecurity program covered the hospitals, clinics, home care, hospice and all the different legal entities which comprised the health system. In order to ensure strategic direction and alignment, a steering committee was set up which met every two weeks.

The cybersecurity program execution was focused on three work streams: process, technology, and team members. In the process work stream, execution covered implementation and audit of policies and procedures, risk assessment and HIPAA (Health Insurance Portability and Accountability Act) compliance, and a monthly information security executive dashboard reviewed by the steering committee. Despite the continually evolving threat landscape, the cybersecurity program delivered strong results in several areas: more than 100 business associate agreements (BAAs) were signed; annual HIPAA risk assessment and remediation plans were put in place; the internal phishing campaign eventually lowered the click rate to 10 percent; there was a five-fold increase in suspicious emails forwarded to the security team; and more than 1,000 laptops were encrypted.

Lexington Clinic (Lexington, Ky.)

Development of a direct-to-employer network

Costs of certain services often vary dramatically between providers, so by selectively designing benefits to increase cost-sharing at providers who provide more expensive care, enrollees are incentivized to see the more efficient providers who provide care at a lower cost, reducing average overall expenditure. Savings can then be passed on to the employer. In this project, implementation was examined with an organization with a self-funded insurance model. Steering beneficiaries toward a tighter network of providers resulted in significant overall reductions in expenditure while improving the health of the overall employee population. Rather than limiting their employee health plan to a lower percentage of area providers like most similar plan designs, the employer entered into a direct-to-employer program with a local, multispecialty physician group: Lexington Clinic.

A key component of a direct-to-employer plan is population health. Lexington Clinic was able to utilize analytics software to deliver value to the employer by implementing high cost/high utilization analysis, undetected chronic disease engagement, and ancillary modality management. Lexington Clinic also determined that there were specific interventions that could be made at critical junctures in the care continuum of the employee population. These interventions would be designed to prevent health issues before they arise, reducing future expenditures and worsened health outcomes. Via the Lexington Clinic premier network, the employer demonstrated a clear reduction in aggregate expenditure from the 2015 to 2016 time period of more than 4 percent.

Lutheran Medical Center (Wheat Ridge, Colo.)

An app for staff engagement

At Lutheran Medical Center, it became a priority to redesign the way staff were engaged. The organization started using an anonymous crowdsourcing platform in 2016, with the goal of creating recipes for success that would help leaders ask the right questions through an anonymous tool to enhance engagement. The tool has established a venue for staff to engage in problem solving and design ideas on their own terms, anonymously, while everyone can follow the conversation in real time. The application was first used in the pharmacy department as a means of understanding low engagement scores, and it allowed all staff to be involved without taking them away from their daily duties.

Lutheran Medical Center was in the 11th percentile when it came to staff satisfaction only two years ago, but now ranks in the 43rd percentile compared to the national average. Having the ability to get staff buy-in before a change happens has been critical in impacting staff satisfaction. Before, only those invited to certain meetings had the opportunity to voice their opinions; now everyone can be reached with a single email or app use. It has been used for solving several clinical problems as well, such as how to design a “cord-free” patient room, and how to transport oxygen tanks around the hospital.

Mercy (St. Louis, Mo.)

Using NLP for heart failure EHR documentation

The goal of this project was to use natural language processing (NLP) to extract key cardiology measures from physician and other clinical notes and incorporate the results into a dataset with discrete data fields. This dataset would then be used to obtain actionable information and contribute to the evaluation of outcomes of medical devices in heart failure patients.

Three key measures that are commonly stored in clinical notes and not available in discrete fields include ejection fraction measurement; patient symptoms including dyspnea, fatigue, dizziness, palpitations, edema and pulmonary congestion; and the New York Heart Association (NYHA) heart failure classification. Mercy patients had 35.5 million clinical notes from both inpatient and outpatient encounters that were extracted, processed and then loaded onto an NLP server. NLP queries were developed by a team of Mercy data scientists to search for relevant linguistic patterns and then evaluated for both precision and recall. The use of NLP in this project facilitated the extraction of vital patient information that is not available in any discrete field in the EHR. Without the ability to track the changes in these three essential measures, it becomes much harder to identify the point of disease progression which is a crucial factor for the evaluation of current treatments and could inform future interventions, according to Mercy officials.
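Mercy’s actual NLP queries are not reproduced in the article. As a heavily simplified illustration of the kind of pattern matching involved, the regexes below pull an ejection fraction and an NYHA class out of free text; production clinical NLP must additionally handle negation, section context, and far more linguistic variation than this sketch does:

```python
import re

# Illustrative patterns only -- real clinical notes vary far more.
EF_PATTERN = re.compile(
    r"\b(?:ejection fraction|EF)\s*(?:of|is|:)?\s*(\d{1,2})\s*%",
    re.IGNORECASE)
NYHA_PATTERN = re.compile(
    r"\bNYHA\s*(?:class)?\s*(I{1,3}V?|IV)\b",
    re.IGNORECASE)

def extract_heart_failure_measures(note):
    """Return (ejection_fraction, nyha_class) found in a note;
    either element is None when the measure is not mentioned."""
    ef = EF_PATTERN.search(note)
    nyha = NYHA_PATTERN.search(note)
    return (int(ef.group(1)) if ef else None,
            nyha.group(1).upper() if nyha else None)

note = "Pt with dyspnea on exertion. Echo: ejection fraction of 35%. NYHA class III."
print(extract_heart_failure_measures(note))  # (35, 'III')
```

Evaluating such queries for precision and recall against clinician-labeled notes, as Mercy’s data scientists did, is what separates a usable extractor from a brittle one.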

Mosaic Life Care (St. Joseph, Mo.)

Revenue management analytics dashboard

Mosaic Life Care provides healthcare and life care services in and around St. Joseph, Missouri and the Kansas City Northland area. The organization’s finance and revenue cycle teams faced challenges with data silos that required caregivers to manually obtain information from disparate systems and manually collate information, subjecting the process to human error, inconsistent processes and concerns about data accuracy.

With the goal of developing a flexible “source of truth” dashboard, the enterprise data warehouse (EDW) team developed an integrated revenue management analytics solution with a front-end dashboard, leveraging the core EDW solution and architecture platform to extract data from best-of-breed systems. Through the new dashboard, financial analysts and management teams can perform analysis and predict future trends. As a result, the dashboard enables real-time, data-driven business decisions spanning multiple disparate systems within a single unified platform.

NYU Langone Health (New York City, N.Y.)

Value-based medicine to improve clinical care

The goal of the project was to leverage health IT tools and related workflows to improve the value of inpatient care. Finance collaborated with the project’s physician champions to identify variations in care both internally and compared to benchmarked external institutions. The project’s physician champions collaborated with IT physician informaticists and IT project teams to design interventions to both reduce cost and improve clinical care.

The suite of interventions included: electronic clinical pathways; blood protocols; intravenous (IV) to oral (PO) medication changes; and lab ordering enhancements. Electronic pathways were created for heart failure, colon surgery, and pneumonia, and blood ordering clinical decision support and analytics were built. These projects realized significant two-year savings, including: electronic clinical pathways: $12.9 million; lab modifications: $3 million; blood utilization: $2.9 million; and IV to PO: $2.2 million.
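The four line items above can be totaled for a quick sense of scale; summing the figures as reported:

```python
# Two-year savings by intervention, in millions of dollars (per NYU Langone).
savings_musd = {
    "electronic clinical pathways": 12.9,
    "lab modifications": 3.0,
    "blood utilization": 2.9,
    "IV to PO": 2.2,
}
total = sum(savings_musd.values())
print(f"Total reported two-year savings: ${total:.1f} million")  # $21.0 million
```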

Penn Medicine (Philadelphia, Pa.)

Standard clinical iPhone effectively enhances patient care

In January 2016, Penn Medicine met with Apple engineers to develop an economical and efficient, fully configured Standard Clinical iPhone (SCiP) that would work with Penn’s mobile device management tool while leveraging Apple’s Device Enrollment Program (DEP) and Volume Purchase Program (VPP). This approach saved the organization 975 man-hours in the initial deployment thanks to DEP’s streamlined setup (15 minutes versus one hour per iPhone). Pushing additional apps to devices without needing an Apple ID and password for download, or any manual touch, also made the implementation efficient. The project placed a vital tool into caregivers’ hands, officials say.
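The stated figures are internally consistent: at 45 minutes saved per device (one hour of manual setup versus 15 minutes with DEP), 975 man-hours implies a deployment of about 1,300 iPhones. A quick check:

```python
manual_hours = 1.0        # manual setup time per iPhone (per the article)
dep_hours = 15 / 60       # DEP streamlined setup per iPhone
saved_hours = 975         # total man-hours saved (per the article)

devices = saved_hours / (manual_hours - dep_hours)
print(f"Implied deployment size: {devices:.0f} iPhones")  # 1300
```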

Using secure texting alone, clinicians were able to coordinate patient flow across care settings with multiple providers and mend gaps in communication. In one example, the cardiac surgery team was on a thread with a patient’s current caregivers. Concerned about swelling around the surgical site, the nurse took a picture of the site and sent it securely with a description. The surgical team was able to provide immediate feedback and resolve the issue remotely.

Innovator Awards Program 2018: Semifinalists – Healthcare Informatics

Editor’s Note: We at Healthcare Informatics were once again ecstatic with the exceptional quality of the submissions we received from innovating patient care organizations across the U.S. In addition to the four winning teams this year (whose stories will be posted throughout this week), our editorial team also selected several runners-up. Below, please find descriptions of the initiatives of the 14 teams whom we have awarded semifinalist status in this year’s program.

Centre for Addiction and Mental Health (Ontario, Canada)

Improving patient care through achievement of HIMSS EMRAM Stage 7

In May 2014, CAMH implemented a clinical information system using a big-bang approach with an integrated team of clinicians, information technology, and other staff. But after implementation, CAMH noted a lack of clinical practice standardization. A new initiative emerged that included work to refine the inputting of clinical documentation to the EHR, the development of electronic whiteboards to display and manage assessment and risk factors, leveraging data to inform improvement initiatives, and many other requirements as defined by the HIMSS EMRAM (EMR Adoption Model) Stage 7 criteria.

CAMH is the first academic teaching hospital to achieve HIMSS Stage 7 in Canada; this achievement is a milestone in both the Canadian and international health landscape. Now, more than 99 percent of CAMH clinically-relevant documentation is completed directly within the EHR and CPOE (computerized provider order entry) rates have been over 90 percent since December 2016. What’s more, the creation of a suicide risk dashboard has led to 90 percent of patients having a suicide risk assessment completed within 24 hours of admission.

Cleveland Clinic (Cleveland, Ohio)

An enterprise imaging service

The goal of the enterprise imaging service is to provide a comprehensive longitudinal medical record through incorporation of all medical images into a single archive. Through a universal viewer, the archive is integrated with the EHR and provides a foundation for image distribution to all caregivers throughout the enterprise. The archive also serves as a foundation for image sharing. Implementation required a comprehensive assessment of all image generating equipment throughout all hospitals and outpatient centers.

The Clinic’s officials say that the establishment of an enterprise imaging program has led to the consolidation of imaging archives throughout the health system. Images which were not previously easily accessible are now readily viewable through the EHR (electronic health record) with access points both within the firewall and from home. To date, 11 different service lines and more than 440 pieces of image generating equipment outside of radiology have been integrated.

Compass Medical (Massachusetts)

Annual wellness, chronic care management and quality outcomes

By leveraging new information technology, Compass Medical has been able to follow proven population health management and care management principles, allowing patient care leaders to identify and target specific population groups, stratify and prioritize care gaps and engage and individualize care plan activities. In 2016, for example, Compass Medical was able to identify and target more than 14,000 Medicare patients that were struggling to manage their chronic health conditions and needed a more personalized and comprehensive care plan. One year later, Compass Medical developed and launched a new Chronic Care Management Program to help engage with and closely manage Medicare patients that suffer from two or more chronic health conditions. With the help of its EHR and big data platform, Compass Medical positioned itself to automate many of the workflows for care management nurses.

The Annual Wellness Visit (AWV) is another example of a preventive care service that has been positively affected by leveraging IT. In 2017, national trends suggested utilization of AWV were still hovering around the low 20-percent range with the highest performing state reaching 35 percent. Utilizing EHR-based patient engagement campaigns for increasing focused outreach, incorporating a team based care model with scribes, and creating standard work processes for reducing provider burden have helped Compass Medical reach 57 percent AWV utilization for its Medicare eligible population by the end of year 2017.

Duke University School of Medicine (Durham, N.C.)

A NICU discrete event simulation model

Duke’s neonatal clinicians care for more than 800 babies each year in the Duke Neonatal Intensive Care Unit (NICU). Although the majority do well, about 40 babies do not survive. How could they improve outcomes and save lives? Duke’s neonatal research team partnered with analytics company SAS to create an analytics-based model of Duke Children’s Hospital’s Level IV neonatal intensive care unit. The result was a discrete event simulation model that closely resembled the clinical outcomes of Duke’s training unit; validated against data held back from the original build, the model also closely tracked actual unit outcomes.

The model uses a vast resource of clinical data to simulate the experience of patients, their conditions and staff responses in a computerized environment. It creates virtual babies experiencing care within a simulated NICU environment, including virtual beds staffed by virtual nurses. The research team attests that they cannot find any evidence of discrete event simulation modeling being used in a NICU setting, making this a first in neonatal care.
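The core mechanics of a discrete event simulation can be sketched in a few lines. The toy model below is not Duke’s actual model — bed count, arrival rate, and length-of-stay parameters are invented for illustration — but it shows the basic pattern: an event queue ordered by time, with arrivals claiming beds and discharges freeing them.

```python
import heapq
import random

def simulate_nicu(n_beds, arrival_rate, mean_los, horizon, seed=0):
    """Toy NICU discrete event simulation: virtual babies arrive as a
    Poisson process, occupy a bed for an exponential length of stay,
    and are diverted if no bed is free. Returns (admitted, diverted).
    All parameters are illustrative, not clinical estimates."""
    rng = random.Random(seed)
    events = []  # min-heap of (time, kind)

    # Pre-generate arrivals over the simulation horizon (in days).
    t = rng.expovariate(arrival_rate)
    while t < horizon:
        heapq.heappush(events, (t, "arrival"))
        t += rng.expovariate(arrival_rate)

    occupied = admitted = diverted = 0
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            if occupied < n_beds:
                occupied += 1
                admitted += 1
                # Schedule this baby's discharge.
                heapq.heappush(events, (now + rng.expovariate(1.0 / mean_los), "discharge"))
            else:
                diverted += 1
        else:  # discharge frees a bed
            occupied -= 1
    return admitted, diverted
```

A real model like Duke’s would replace the invented distributions with ones fitted to years of clinical data, and track outcomes (not just occupancy), but the event-queue skeleton is the same.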

Houston Methodist (Houston, Texas)

A coordinated care/Medicare Shared Savings Program (MSSP) initiative

Houston Methodist’s MSSP program, Houston Methodist Coordinated Care (HMCC), can track and report Medicare patients’ healthcare visits and medical details. The program’s successful execution rests on a layering of technologies, with the foundation being the organization’s integrated EHR platform and a separate population management tool.

The project was centered around six core elements: 1) becoming the first ACO (accountable care organization) in Texas to acquire real-time admission, discharge and transfer (ADT) notification capability that links all health providers; 2) chronic heart failure home monitoring; 3) real-time notification when HMCC patients came into the ED; 4) risk assessments for emergency room visits, hospital readmissions and the need for complex care; 5) same-day appointment facilitation; and 6) care team alerts. In sum, there were 17,000 Houston Methodist patients in HMCC in 2017, year-to-date, with 105 participating physicians. Total healthcare cost savings year-to-date are more than $1.3 million, according to officials.

Indiana University Health (Indianapolis, Ind.)

FHIR HIEdrant: making big data actionable at the point of care

One of the difficult challenges for many HIEs (health information exchanges) is the time and effort it takes to reach out to a second system to search for needed data at the point of care. As such, the goal at IU Health was to develop an application within the clinical workflow that, at the click of a single button, brings data from the HIE relating to the patient’s chief complaint back into that workflow.

The first phase of this project was building the framework and the mechanisms to make this a possibility and applying it to a single context: an emergency department patient with chest pain. Leaders at IU Health are utilizing the Fast Healthcare Interoperability Resources (FHIR) standard to communicate from the IU Health Cerner EHR out to the HIE and retrieve five specific data elements that are germane to caring for a chest pain patient in the emergency department and understanding their risk. Within the workflow, the clinician is presented with the most recent ECG, cardiology note, discharge summary, and catheterization report, among other data. According to IU Health officials, this is the first FHIR-based application that directly accesses an HIE and delivers context-specific data about a patient directly to the clinical workflow.
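In FHIR terms, “fetch the most recent item of each type” boils down to a resource search sorted by date and capped at one result. The sketch below is not IU Health’s application — the endpoint, patient ID, and choice of resource types are hypothetical — but it shows how such search URLs are assembled using standard FHIR search result parameters (`_sort`, `_count`):

```python
from urllib.parse import urlencode

FHIR_BASE = "https://hie.example.org/fhir"  # hypothetical HIE endpoint

def most_recent_query(resource_type, patient_id, type_code=None):
    """Build a FHIR search URL asking for the single most recent
    resource of the given type for one patient. _sort and _count are
    standard FHIR search result parameters."""
    params = {"patient": patient_id, "_sort": "-date", "_count": "1"}
    if type_code:
        params["type"] = type_code  # e.g. a coded document type
    return f"{FHIR_BASE}/{resource_type}?{urlencode(params)}"

# One search per data element the ED clinician needs, for example:
chest_pain_searches = [
    most_recent_query("DiagnosticReport", "12345"),   # e.g. ECG or cath report
    most_recent_query("DocumentReference", "12345"),  # e.g. note or discharge summary
]
```

In practice each of the five data elements would map to a specific coded document or report type, and the responses would be rendered inside the EHR workflow rather than returned as raw URLs.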

Johns Hopkins Health System (Baltimore, Md.)

inHealth precision medicine initiative

The precision medicine initiative at Johns Hopkins Medicine and University, inHealth, seeks to improve individual and population health outcomes through scientific advances at the intersection of biomedical research and data science. Through a collaboration between The Johns Hopkins Applied Physics Laboratory (APL) and Johns Hopkins Medicine (JHM), inHealth is building a big-data precision medicine platform with the goal of accelerating the translation of insight into care delivery.

The first result of this broad, multidisciplinary effort was the successful creation of two Precision Medicine Centers of Excellence (PMCoE) focused on multiple sclerosis and prostate active surveillance. The organization’s Technology Innovation Center has developed applications to garner new data and learnings from clinical practice and feed them back into discovery. Physicians have begun using the discovery platform to facilitate conversations with their patients about their treatment options and risks. The experiences of these centers will lead the next wave of PMCoEs, expanding the utility of the platform.

Lakeland Health (St. Joseph, Mich.)

Something wicked this way comes

Leaders at Lakeland Health set three core cybersecurity goals: (a) put risk management and cybersecurity near the top of the health system leadership agenda; (b) use innovative strategies and tools to execute the cybersecurity program; and (c) shift focus from fear to clinical integrity. The cybersecurity program covered the hospitals, clinics, home care, hospice and all the different legal entities that comprise the health system. To ensure strategic direction and alignment, a steering committee was set up that met every two weeks.

The cybersecurity program execution was focused on three work streams: process, technology and team members. In the process work stream, execution covered implementation and audit of policies and procedures, risk assessment and HIPAA (Health Insurance Portability and Accountability Act) compliance, and a monthly information security executive dashboard that was reviewed by the steering committee. Despite a continually evolving threat landscape, the cybersecurity program delivered strong results in different areas: more than 100 business associate agreements (BAAs) were signed; annual HIPAA risk assessments and remediation plans were put in place; the initial internal phishing campaign eventually lowered the click rate to 10 percent; suspicious emails forwarded to the security team increased five-fold; and more than 1,000 laptops were encrypted.

Lexington Clinic (Lexington, Ky.)

Development of a direct-to-employer network

Costs of certain services often vary dramatically between providers, so by selectively designing benefits to increase cost-sharing at providers who deliver more expensive care, enrollees are incentivized to see the more efficient providers who deliver care at lower cost, reducing average overall expenditure. Savings can then be passed on to the employer. In this project, implementation was examined with an organization with a self-funded insurance model. Steering beneficiaries toward a tighter network of providers resulted in significant overall reductions in expenditure while improving the health of the overall employee population. Rather than limiting its employee health plan to a lower percentage of area providers, as most similar plan designs do, the employer entered into a direct-to-employer program with a local, multispecialty physician group: Lexington Clinic.

A key component of a direct-to-employer plan is population health. Lexington Clinic was able to utilize analytics software to deliver value to the employer by implementing high-cost/high-utilization analysis, undetected chronic disease engagement, and ancillary modality management. Lexington Clinic also determined that there were specific interventions that could be made at critical junctures in the care continuum of the employee population. These interventions were designed to prevent health issues before they arise, reducing future expenditures and worsened health outcomes. Via the Lexington Clinic premier network, the employer demonstrated a clear reduction in aggregate expenditure of more than 4 percent from 2015 to 2016.

Lutheran Medical Center (Wheat Ridge, Colo.)

An app for staff engagement

At Lutheran Medical Center, it became a priority to redesign the way the staff was engaged. The organization started using an anonymous crowdsourcing platform in 2016 with the goal of creating recipes for success that would help leaders in the organization ask the right questions and enhance engagement. Using the tool has established a venue for staff to engage in problem solving and design ideas on their own terms, anonymously, while all can follow along in the conversation in real time. The application/website was first used in the pharmacy department as a means to understand low engagement scores. The tool allowed all staff to be involved without taking them away from their daily duties.

Lutheran Medical Center was in the 11th percentile when it came to staff satisfaction only two years ago, but now ranks in the 43rd percentile compared to the national average. Having the ability to get staff buy-in before a change happens has been critical in impacting staff satisfaction. Before, only those invited to certain meetings had the opportunity to voice their opinions; now everyone can be reached with a single email or app use. It has been used for solving several clinical problems as well, such as how to design a “cord-free” patient room, and how to transport oxygen tanks around the hospital.

Mercy (St. Louis, Mo.)

Using NLP for heart failure EHR documentation

The goal of this project was to use natural language processing (NLP) to extract key cardiology measures from physician and other clinical notes and incorporate the results into a dataset with discrete data fields. This dataset would then be used to obtain actionable information and contribute to the evaluation of outcomes of medical devices in heart failure patients.

Three key measures that are commonly stored in clinical notes but not available in discrete fields are the ejection fraction measurement; patient symptoms including dyspnea, fatigue, dizziness, palpitations, edema and pulmonary congestion; and the New York Heart Association (NYHA) heart failure classification. Mercy patients had 35.5 million clinical notes from both inpatient and outpatient encounters, which were extracted, processed and then loaded onto an NLP server. NLP queries were developed by a team of Mercy data scientists to search for relevant linguistic patterns and then evaluated for both precision and recall. The use of NLP in this project facilitated the extraction of vital patient information that is not available in any discrete field in the EHR. Without the ability to track changes in these three essential measures, it becomes much harder to identify the point of disease progression, which is a crucial factor in the evaluation of current treatments and could inform future interventions, according to Mercy officials.
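Searching notes for linguistic patterns like these often begins with simple pattern matching. The sketch below is illustrative only — these are not Mercy’s production queries, which were built and tuned by its data science team, and real clinical NLP must also handle negation, templated text, and many more surface forms:

```python
import re

# Illustrative patterns: an ejection fraction stated as a percentage,
# and an NYHA class written as a Roman numeral.
EF_PATTERN = re.compile(r"ejection fraction[^0-9]{0,20}(\d{1,3})\s*%", re.IGNORECASE)
NYHA_PATTERN = re.compile(r"NYHA\s+(?:class\s+)?(IV|III|II|I)", re.IGNORECASE)

def extract_heart_failure_measures(note):
    """Return a dict with any ejection fraction (%) and NYHA class
    found in a free-text note; empty dict if neither is present."""
    found = {}
    ef = EF_PATTERN.search(note)
    if ef:
        found["ejection_fraction"] = int(ef.group(1))
    nyha = NYHA_PATTERN.search(note)
    if nyha:
        found["nyha_class"] = nyha.group(1).upper()
    return found
```

Evaluating such patterns for precision (how often a match is correct) and recall (how many true mentions are found), as Mercy’s team did, is what separates a toy regex from a usable extraction query.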

Mosaic Life Care (St. Joseph, Mo.)

Revenue management analytics dashboard

Mosaic Life Care provides healthcare and life care services in and around St. Joseph, Missouri and the Kansas City Northland area. The organization’s finance and revenue cycle teams faced challenges with data silos that required caregivers to manually obtain information from disparate systems and manually collate information, subjecting the process to human error, inconsistent processes and concerns about data accuracy.

With the goal of developing a flexible “source of truth” dashboard, the enterprise data warehouse team developed an integrated revenue management analytics solution with a front-end dashboard, leveraging the core EDW solution and architecture platform to extract data from the best-of-breed systems. Through the new dashboard, financial analysts and management teams can perform analysis and predict future trends. As a result, the dashboard enables real-time, data-driven business decisions spanning multiple disparate systems within a single unified platform.

NYU Langone Health (New York City, N.Y.)

Value-based medicine to improve clinical care

The goal of the project was to leverage health IT tools and related workflows to improve the value of inpatient care. Finance collaborated with the project’s physician champions to identify variations in care both internally and compared to benchmarked external institutions. The project’s physician champions collaborated with IT physician informaticists and IT project teams to design interventions to both reduce cost and improve clinical care.

The suite of interventions included: electronic clinical pathways; blood protocols; intravenous (IV) to oral (PO) medication changes; and lab ordering enhancements. Electronic pathways were created for heart failure, colon surgery, and pneumonia, and blood ordering clinical decision support and analytics were built. These projects realized significant two-year savings, including: electronic clinical pathways: $12.9 million; lab modifications: $3 million; blood utilization: $2.9 million; and IV to PO: $2.2 million.

Penn Medicine (Philadelphia, Pa.)

Standard clinical iPhone effectively enhances patient care

In January 2016, Penn Medicine met with Apple engineers to develop an economical and efficient full-configuration Standard Clinical iPhone (SCiP) that works with Penn’s mobile device management tool while leveraging Apple’s Device Enrollment Program (DEP) and Volume Purchasing Program (VPP). Using this method saved the organization 975 staff-hours in its initial deployment thanks to DEP streamlined setup (15 minutes versus one hour for each iPhone). Pushing additional apps to devices without needing an Apple ID and password for download, or any manual touch, also made the implementation efficient. The project placed a vital tool into caregivers’ hands, officials say.

Just by using secure texting, clinicians were able to coordinate patient flow across care settings with multiple providers and mend gaps in communication. In one example, the cardiac surgery team was on a thread with a patient’s current caregivers. Concerned about swelling around the surgical site, the nurse took a picture of the site and sent it securely with a description. The surgical team was able to provide immediate feedback and resolve the issue remotely.

A cybersecurity expert explains how to fight Russian election meddling – Vox

There is little room for doubt that Russia interfered in the 2016 election. The Justice Department on Friday handed down indictments of 13 Russian nationals and three Russian companies for meddling in United States political and election processes, the latest item in a litany of evidence that Russia, well, did it.

Even scarier, there is every indication that Russia is likely to try to interfere in the American political process again — and many of the technologies, trends, and processes it exploited in the past are largely unchanged. (Catch that New York Times story on the Twitter bot factories?)

“I’ll tell you right up front, it is going to happen again,” Greg Touhill, a retired Air Force general officer and one of the nation’s premier cybersecurity experts, told me. Touhill is currently president of Cyxtera Federal Group, a secure infrastructure company. Before that, he served in a wide range of government roles, including as the first United States chief information security officer in 2016.

I spoke with Touhill about what the United States can do to stop Russia from interfering in US politics and elections in 2018 and beyond. While the federal government certainly has a major role to play — in deterring future interference, in supporting state and local election officials, and in boosting national security efforts — Touhill noted that the technology companies Russians use as a conduit in their disinformation campaign have a responsibility as well.

So do everyday Americans, in using good judgment when they’re reading news sources: “If it sounds phony, it probably is,” he said.

This interview has been edited and condensed for clarity.

Emily Stewart

We keep getting more details about Russian meddling in the 2016 election, including Friday’s indictments, and we’re also seeing warnings that Russians are likely to try something again in 2018. What can and should the federal government and other entities be doing so that we don’t see this happen again?

Greg Touhill

I’ll tell you right up front, it is going to happen again. It’s happened before, and frankly, it’s happened throughout all of time. A different way to phrase it is how do we prepare ourselves to deal with this when it happens again? And how do we mitigate it and the like?

Information operations, influence operations, or whatever you want to call it — and different nations call it different things — people have recognized, as Francis Bacon used to say, knowledge is power. They’re constantly trying to seek the ability to influence and get knowledge and get an information advantage. From my perch, I think that we want to deter further action, we want to mitigate it when it does happen, and we want to take action that’s effective and proportionate when we do detect that somebody is breaking international norms.

Emily Stewart

How do you balance deterrence of future action against retaliation or punishment of past action? How would you approach it?

Greg Touhill

If you take a look at all the different instruments of power that are available to the United States, we have the military option, which as a retired officer I think should be the last resort, but certainly it should be on the table for consideration, particularly when it comes to deterrence. We also have the political, the economic, and the diplomatic means as well.

First things first is you have to — when you see somebody who is breaking norms and is engaged in things that we don’t believe as an international community are the right things to do — you need to confront that, and you need to present the evidence that says, “Hey, here is where you are breaking the norms.”

We have been working, from the United States government, on a forward-thinking leadership approach to cyber norms. That should be a priority in the international community, and the United States should take a continuous leadership role in making sure that we have a clear understanding and articulation of acceptable behavior in the cyber domain. Affirmation of the cyber norms that have already been proposed needs to be a priority for our diplomatic efforts.

Secondly, when we see folks that are deviating from those norms, there needs to be some accountability, and that’s where we have the ability under our current legal framework to issue economic sanctions, diplomatic sanctions, and, in [Friday’s] case, legal indictments, where we are trying to hold individuals and states accountable for violating law and, as I mentioned, norms of acceptable behavior.

Emily Stewart

What agencies or entities within the government need to take the lead here?

Greg Touhill

Frankly, this is a whole of government issue. And as you take a look at all those instruments of national power, it’s distributed across departments and agencies. That’s a reason why in 1947 we established the National Security Council to help coordinate a lot of the activities dealing with national security.

I would submit that our national security and our national prosperity are intrinsically linked to cybersecurity and the integrity of information technology and the information that’s contained within it. Try to name a business, an institution, or a societal institution itself that doesn’t rely on IT right now; it’s very difficult. As we take a look at the roles across the federal government — the Department of State, the Department of Treasury, the Department of Homeland Security, the Department of Defense, the Department of Commerce, the Department of Justice — virtually every single major department and agency has a stake in those elements of national power that we could use and leverage to deal with issues of deterrence and proper response to cyberattacks.

The National Security Council, working under the National Command Authority, that’s where I’m looking for leadership to coordinate all instruments of national power.

Emily Stewart

What about the president? On Friday, the indictments come down, and he says, “No collusion!”

Greg Touhill

I don’t necessarily see the discussion of collusion as the same as acknowledging that we have an issue with Russian-based actors engaged in influence operations against the United States. I took the collusion issue as a separate domestic issue as opposed to the actual influence operations.

I believe that the evidence we’ve seen thus far points toward Russian-based actors engaged in targeted influence operations directed against the people of the United States with what appears to be an ultimate goal to undermine democratic institutions in the United States.

Emily Stewart

Well, but Trump doesn’t seem hyper concerned about Russia; he seems to be downplaying it.

Greg Touhill

I don’t know President Trump, nor do I know his leadership style, so I really can’t comment on that.

It’s very possible, and I wouldn’t rule it out, that he has directed the National Security Council to provide him different options, and as you take a look at activities at [a] nation-state level, many of those deliberations are going to be held in very classified settings. At this point, I really can’t comment because I don’t know what he’s directing in the background. Nor would I expect, if it were President Obama or President Bush or President Clinton or any of his predecessors — this is really an important topic, and I’m confident that the National Security Council is in fact looking at all different options that would be on the table and advising the president as such.

Emily Stewart

Beyond the government and the president, what do companies like Facebook and Twitter, which seem to be a major part of what happened in 2016, need to be doing?

Greg Touhill

If you look at it through the lens of cybersecurity, there are three major lenses: people, process, and technology. You’re taking a look at all sorts of different media platforms, which could include Twitter, Facebook, and the like; these are powerful social media platforms. You want to make sure you get it right.

You want to make sure that your people are properly trained to maintain the integrity of the product and information that you’re putting out. You want to make sure that you have the proper processes in place to properly vet input so that you, in fact, are not putting out, for lack of a better term, “fake news.” It’s almost like yelling, “Fire!” in a movie theater: You want to make sure that you are, in fact, accurate and that your product is trusted. You want to put in the right technologies to make sure that you have positive control over the information that you’re sharing.

There are plenty of tools currently developed and being fielded right now that can help on the technology front, and certainly training and processes are part of good order and discipline in any business these days. From a technology standpoint, you should not let just anybody have access to your information or equipment or systems and the like.

Having positive control over the platforms themselves is critically important. Technologies such as software-defined perimeters, which are identity-centric and validate authorities and identities prior to connecting (authorization first, connection second), are critically important. More and more companies that want positive control over their tech to protect the information inside it are switching to things like software-defined perimeters, regardless of what industry they’re in: finance, social media, etc.

I am heartened, though, by the rhetoric of some of the companies, where they’re coming out and saying, “Hey, we’re putting things in so people, if they see something, they can say something, question whether or not this is fake news.” That’s a step in the right direction, but I want to see more.

Emily Stewart

I’m interested in this question of whether social media companies need to know their customers. Banks are subject to know-your-customer and anti-money laundering laws; can’t technology companies be too? At the same time, with those sorts of regulations, you tend to hear protests on the First Amendment front — namely, shouldn’t people be able to say whatever they want, presumably, on Twitter, even if it is a bot?

Greg Touhill

That gets back to yelling, “Fire!” in a movie theater. There was a great debate about 100 years ago as to First Amendment rights. Do you have the right to yell, “Fire!” in the movie theater if public safety is at risk? If we take a look at different companies that are out there, do they in fact have the code of ethics to make sure the information presented is in fact proper?

Google, what’s their theme? Do no harm, right? If Google is serving up info that may in fact be harmful, is that contrary to their own ethics? It’s a heavy issue, and I’m not necessarily a philosopher, but Professor Touhill would tell you that you’ve got a great capability, and technology doesn’t always solve every problem. Leadership is needed at all levels, including in the technology areas, to try to combat this problem.

And as I also tell my mother, you need to not draw conclusions from a single news source; you need to go survey the whole landscape. I believe that freedom of the press here in the United States is one of our greatest strengths, and I expect the press to do their bit too, to make sure that when they’re seeing fake news they’re pulling it out so that we can, in fact, all work together as a team, as a people, to make sure that the general population gets the right news, the truth. That’s what we’re all looking for. It’s more than just technology.

Emily Stewart

Along those lines, beyond the government, tech companies, the press, what about me, sitting at home on my computer? Is there some role citizens need to play in this in being smarter in the way that they consume news and information?

Greg Touhill

There are some very straightforward things that every citizen can and should be doing.

One is don’t believe everything you see online. Do your homework: check multiple sources, stay away from suspicious websites, and go to news sources that are trusted and maintain the same level of integrity that you would hope to promote yourself. You want to get your news from folks who will double-check and triple-check their sources, who are unimpeachable, and who recognize their responsibility. And if it’s coming from a news source that you don’t know, then it’s probably not a trusted source. That’s the first thing.

Second thing, follow the advice I gave my mother — get your news from multiple sources. There’s more than one network on TV, and there’s more than one newspaper online. The great news organizations have at their core the same story, but they give you different analyses, different perspectives. If you want to be better educated about the news, you’re better served by understanding those different perspectives. Make sure that you’re doing your homework and not necessarily going to just one news source.

Third, if it sounds phony, it probably is. Dig deeper when you see things that seem outrageous. You may find that things that are particularly outrageous, if they’re not coming from a trusted news source, probably are made up.

Emily Stewart

In wrapping up, going forward, just looking at the next six months, if you could pick out three things that the federal government could do to safeguard election integrity, what do you think they should do?

Greg Touhill

Number one, work with state governments — state, local, county, tribal, territorial governments — because all elections are managed locally. The federal government does not go out and do voter registration; the federal government does not do the collection of votes, and the federal government does not do the tabulation of votes. That’s all done locally and up to the state level.

It’s really important for the federal government to work with the states and the counties to make sure they are hardened. I mentioned those three processes — voter registration, the actual casting of the ballot, and the actual tabulation, counting the votes — three individual processes that are all critical.

That’s all done at the state level, [but] the federal government can assist the states on that. They can assist with best practices. Having been director of the NCCIC for a while — that’s the National Cybersecurity and Communications Integration Center, which houses the US-CERT and the ICS-CERT, the Industrial Control Systems Cyber Emergency Response Team — we went out and reached out to the secretaries of state in different states and offered assistance.

There’s a lot of discussion right now as to how the states want to use the capabilities and best practices and the like, but I think that’s something that still needs to be at the top of the agenda at the state level as well as within the Department of Homeland Security to help.

Two, from an influence operations standpoint, we have to do counter influence operations, and I think we’ve already started a lot of that. We need to make sure that the American people understand that there are influence operations that are, in fact, being conducted against us, and the media has been really good as of late, for example, highlighting the fact that we had the major intelligence leaders testifying before Congress this past week, raising that alert.

The next step is for the federal government to actually have a plan on how to educate and inform citizens as to, “What do I need to do in an environment where influence operations are ongoing?” That’s going to be very difficult for the United States government to do given the fact that we cherish freedom of the press and our First Amendment, but we do need to make sure that we have an educated and informed populace.

The third thing that the federal government should be doing, in my opinion, is be[ing] very clear from a deterrence standpoint what the consequences would be for any entity that is trying to interfere with our free and open democratic processes. There should be accountability. There should be activity leveraging diplomatic and other instruments of national power to deter any entity from attacking our most cherished democratic institutions.

SVVSD senior wins National Center for Women and Information Technology award – Boulder Daily Camera

Michelle Tran, a senior at Niwot High School, credits a middle school robotics club and finding a community through the National Center for Women and Information Technology for her ongoing love of computer science.

Tran is one of 41 national winners of the National Center for Women and Information Technology’s Aspirations in Computing Award.

“It opened up a whole community of women in technology,” said Tran, who entered the contest for the third time this school year. “I found that amazing, and I kept pursuing it to see if I could rank higher.”

The contest is open to high school girls in grades nine through 12 and recognizes “demonstrated interest and achievements in computing, proven leadership ability and academic performance.” This year, the 41 winners and 350 honorable mentions were chosen from about 3,600 applicants.

Fairview High senior Andrea Lin and Skyline High senior Corinne Herbst both received honorable mentions.

Tran described herself as “technologically incompetent” in middle school, joining the robotics team because she wanted to improve.

“I realized my love for problem solving translated quite easily to computer science,” she said. “It’s all about using technology to problem solve.”


She’s so passionate about the value of introducing middle schoolers, especially girls, to computer science that she’s teaching a middle school computer animation class.

“It’s before they’re told that it’s not cool,” she said.

At St. Vrain Valley’s Innovation Center, she’s working on a drone team to program an Android app that sends weather data to the National Center for Atmospheric Research. She’s also the co-leader of the aquatic robotics team, which is building a submarine to inspect Longmont’s water infrastructure.

“Without the Innovation Center, I don’t think I would have gotten this award,” she said. “The real-life tech applications were invaluable.”

She also mentors robotics teams and is on a technology team of students who are Apple certified to repair student devices.

Like Tran, Herbst said her introduction to computer science came in middle school. She joined a middle school computer science club for girls.

“I just loved the ability to manipulate and create things online and all of the different things I could do with it,” she said.

Her work at the Innovation Center includes building and programming a prosthetic arm and a heart rate monitor as part of the biomedical engineering team. In her senior design class at Skyline, she’s also working on a headband to detect hypothermia.

Herbst, who plans to double major in biomedical engineering and mechanical engineering at Colorado State University, said having an organization like the National Center for Women and Information Technology to support high school girls interested in computer science is “extremely helpful.”

“There’s such a small percentage of girls who go into that field,” she said. “It’s a really difficult major to go into. Without support, a lot of girls get frustrated and switch to something else.”

Fairview’s Andrea Lin maxed out on computer science classes her sophomore year and started working on the University of Colorado’s PhET Interactive Simulations Project.

Founded in 2002 by Nobel Laureate and former CU physics professor Carl Wieman, the PhET project creates free, interactive math and science simulations for educators.

Lin worked on a project rewriting code for the most popular simulations to make them work better in web browsers on iPads and Chromebooks. Her revised simulation was published in August.

Lin said she’s continuing to suggest improvements based on feedback from Fairview science teachers and has taken on smaller projects, such as documenting code and game design.

“I am very honored to have received their recognition,” she said about her National Center for Women and Information Technology honorable mention.

Amy Bounds: 303-473-1341, boundsa@dailycamera.com or twitter.com/boundsa

‘Cultivating an Atmosphere of Inclusiveness’ – UC Davis

“At UC Davis, we acknowledge and honor exemplary faculty, staff, students and community members who help to cultivate an atmosphere of inclusiveness. They speak to the heart of what makes our campus and region a great place to work, teach, learn, play and live.”

This is part of what Gary S. May had to say Feb. 6 in presenting the 2018 Chancellor’s Achievement Awards for Diversity and Community to eight individuals — in the categories of Academic Senate, Academic Federation, undergraduate, graduate student, postdoctoral, staff, special recognition and community — and three departments.

The awards ceremony took place in the early evening at the Chancellor’s Residence. “This event is a perfect way to cap my workday,” May said. “The spirit of these awards speaks to me deeply on a personal and professional level” — as a college student who remembers well the feeling of being the only person of color in the lecture halls and laboratories, and as an engineering professor and dean working hard to change that, especially for students from ethnic groups that are underrepresented in the STEM fields.

“UC Davis’ strong commitment to diversity is one of the key reasons I wanted to come here,” May said. “I wanted to be part of a community that deliberately recruits, retains, embraces and celebrates people with backgrounds, gender identities and skill sets that are underrepresented in higher education. I wanted to be part of a community that honors the promoters of socio-economic mobility who we are celebrating today.”

Here are the 2018 award recipients, with comments about them condensed from nomination forms and remarks from the awards ceremony, delivered by Rahim Reed, associate executive vice chancellor, Office of Campus Community Relations. You can read more about the awardees here.

Individual award recipients

Academic Senate: Natalia Deeb-Sossa

Associate professor of Chicano/a studies, recognized for her socially and politically engaged scholarship, community outreach and contributions to marginalized communities. For example, she founded the Knights Landing Bridge Program, now known as the UC Davis Chicana/o Bridge Program to reflect that UC Davis students provide “bridge” tutoring not only in Knights Landing but in other rural communities, as well. “As a professor, she is highly regarded by her students who often highlight her willingness to support them beyond traditional teaching duties.”

Academic Federation: Jorge Garcia

Clinical professor of internal medicine; and interim associate director, Office of Student and Resident Diversity. “His efforts have helped to ensure that UC Davis welcomes diversity with open arms. … Although he is an accomplished physician, he has never forgotten the awkwardness and isolation he felt in embarking on a career in medicine, and then in academic medicine. This is why Dr. Garcia relishes his position as a role model and inspirational coach for underrepresented students in medicine.”

Undergraduate: Samantha Chiang

She is a fourth-year English major and Asian American studies minor, and a former ASUCD senator (2016-17). “Her passion for assisting marginalized and underrepresented communities is a reflection of her deep desire to create a more equitable and inclusive campus environment.” She is the founding director of the UC Davis Mental Health Initiative, which runs the annual mental health conference and awareness month, and has also worked in the areas of disability rights and cultural competency training. She worked with Student Health and Counseling Services to create translated insurance documents in Mandarin and Spanish.

Graduate Student: Hung Doan

This plant pathology student believes that service is at the heart of scholarship. He mentors undergraduates from underrepresented groups, and he works to alleviate food insecurity within the UC Davis student community (especially among underrepresented students) and in the surrounding community. Since 2011, he has worked as coordinator and head cook for the student-run soup kitchen HELP, which stands for Help and Education Leading to the Prevention of Poverty.

Postdoctoral: Lauren Libero

She studies at the MIND Institute, where she is the volunteer co-leader of a social skills program for autistic adults and family members, and a support group leader. One of those groups, for family members of people on the autism spectrum, was on the verge of shutting down, due to a staff retirement, until Libero advocated to keep it going with her as the lead staff member. She started a support group for women on the autism spectrum, and mentors children and young adults in theater and improvisation to enhance their communication skills.

Staff Award: Lina Mendez

Associate director, Center for Chicanx and Latinx Academic Student Success. “Through her research as well as her lived experiences and journey in support of the Chicanx and Latinx student communities, she has focused on channeling their potential in the pursuit of educational excellence, while also working to shape the institutions that serve them” — including the Center for Educational Effectiveness (as a graduate student) and the UC Davis Health Center for Reducing Health Disparities (as a postdoc).

Special Recognition: Barbara Ashby

The manager of WorkLife and Wellness has devoted her career to program and policy development in support of women, children and families. She secured grants and other funding to assist student parents with child care expenses, and established three child care facilities serving more than 300 children. She founded the Breastfeeding Support Program, and she also was instrumental in developing workplace flexibility policy. More recently she collaborated with the Women’s Resources and Research Center to establish the Caregiver Support Group and Education Program.

Community Achievement: Cassandra Jennings

President and chief executive officer, Greater Sacramento Urban League, who formerly worked in Sacramento city government and at the Sacramento Housing and Redevelopment Agency, including six years as deputy executive director. In her three years in the Urban League’s top leadership post, she has assisted UC Davis’ outreach efforts in underserved communities in Sacramento through Sacramento Area Youth Speaks, or SAYS, a UC Davis-run program that is now co-located at Urban League headquarters in Del Paso Heights.

Honorary awards

The campus introduced this category last year to recognize departments and divisions for taking the initiative to include training in diversity and inclusion as part of organizational and staff development.

“These efforts are in support of the UC Davis Diversity and Inclusion Initiative, and it is our hope that the campus community will be inspired by these organizations’ proactive measures in operationalizing our Principles of Community, and in striving towards a more diverse and inclusive UC Davis,” Reed said.

The newest honorees:

UC Davis Health Information Technology Division award recipients: From left, Stefan Toma, project manager, information technology (IT) departmental services group; Katie Holland, associate business partner, IT human resources (Holland recently joined UC Davis Health human resources, as a manager); Myrene Abot, supervisor, IT human resources; Richard Falcon, supervisor, IT client engineering; Charron Andrus, technical project manager, IT enterprise applications; and Daniel Marenco, supervisor, IT interface operations. Not pictured: John Cook, interim chief information officer.

UC Davis Health Information Technology Division — It has worked with UC Davis Health’s Office for Equity, Diversity and Inclusion the last two years to offer diversity and inclusion training to 70 IT supervisors. Management training includes “The Impact of Unconscious Bias on Workplace Teams” and “Understanding Generational Differences” to help improve communication, teamwork and employee engagement. Individual teams are encouraged to arrange their own trainings, say, with speakers from the Harassment and Discrimination Assistance and Prevention Program, or HDAPP. The Office for Equity, Diversity and Inclusion will host four Diversity and Inclusion Dialogues for the IT division to assist in building a culture of lifelong learning in diversity and inclusion.

Editor’s note about the photo caption and award summary above: As originally published, we gave the incorrect title of the unit being honored. It is the UC Davis Health Information Technology Division, as corrected above. We apologize for the error.


School of Medicine Postbaccalaureate Program — This is a one-year program designed to help educationally and/or socio-economically disadvantaged students become more competitive applicants to medical school. The program partners with the Office of Campus Community Relations for sessions on unpacking oppression, microaggressions and stereotype threat, and weaves these topics into conversations about understanding diversity, and to further develop students’ critical thinking skills. The Postbaccalaureate Program participates in the Campus Community Book Project to further inform students’ understanding of equity issues and how they translate to the health care fields. Elio A. Gutierrez, program coordinator, accepted the award, which also recognized Jose A. Morfin of the Department of Nephrology.

Student Housing and Dining Services: From left, Rahim Reed, associate executive vice chancellor, Office of Campus Community Relations; Corey Pope, assistant director; Catrina Wagner, associate director; Duane Lindsay, executive project analyst; Connie Quintero, training coordinator; Emily Galindo, associate vice chancellor, auxiliary services; and Gary S. May, chancellor.

Student Housing and Dining Services — All leads and managers undergo professional development training on “Understanding Diversity,” “Anti-Bullying,” “Cross-Cultural Communication” and “Conflict Management,” all meant to encourage staff to live and practice the Principles of Community at work, among colleagues, and with the campus community members they serve. Student Housing and Dining Services also ensures that its student staff, especially those who work in advising capacities, are exposed to the Campus Community Book Project, integrating the chosen book as part of student staff training.

Photos by Linda Mijangos/Office of Campus Community Relations

Follow Dateline UC Davis on Twitter.
