China’s massive investment in artificial intelligence has an insidious downside – Science Magazine

Some firms in China now use artificial intelligence–powered facial recognition programs to confirm identities.


BEIJING—In a gleaming high-rise here in northern Beijing’s Haidian district, two hardware jocks in their 20s are testing new computer chips that might someday make smartphones, robots, and autonomous vehicles truly intelligent. A wiry young man in an untucked plaid flannel shirt watches appraisingly. The onlooker, Chen Yunji, a 34-year-old computer scientist and founding technical adviser of Cambricon Technologies here, explains that traditional processors, designed decades before the recent tsunami of artificial intelligence (AI) research, “are slow and energy inefficient” at processing the reams of data required for AI. “Even if you have a very good algorithm or application,” he says, its usefulness in everyday life is limited if you can’t run it on your phone, car, or appliance. “Our goal is to change all lives.”

In 2012, the seminal Google Brain project required 16,000 microprocessor cores to run algorithms capable of learning to identify a cat. The feat was hailed as a breakthrough in deep learning: crunching vast training data sets to find patterns without guidance from a human programmer. A year later, Chen Yunji and his brother, Chen Tianshi, who is now Cambricon’s CEO, teamed up to design a novel chip architecture that could enable portable consumer devices to rival that feat—making them capable of recognizing faces, navigating roads, translating languages, spotting useful information, or identifying “fake news.”
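The core idea of deep learning described above—finding a rule from labeled examples rather than from hand-written instructions—can be sketched in miniature. The toy model below is illustrative only and bears no relation to Google Brain’s actual system: it learns the rule “label is 1 when x exceeds 0.5” purely from examples, never being told the rule itself.

```python
import math

# Toy illustration of learning from data: the program is never told
# the rule (label = 1 when x > 0.5); it infers it from examples.
data = [(x / 100.0, 1.0 if x > 50 else 0.0) for x in range(100)]

w, b = 0.0, 0.0           # model parameters, learned from the data
lr = 0.5                  # learning rate
for _ in range(500):      # plain gradient descent on a tiny model
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid prediction
        gw += (p - y) * x                          # cross-entropy gradient
        gb += (p - y)
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

def predict(x):
    """Classify a new input with the learned decision boundary."""
    return 1 if w * x + b > 0 else 0

print(predict(0.9), predict(0.1))  # the learned rule generalizes: 1 0
```

Real deep learning systems stack millions of such learned parameters in many layers, which is why they demand the specialized processors the Chen brothers design.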

Developers hope artificial intelligence–optimized chips like the Cambricon-1A will enable mobile devices to learn on their own


Tech companies and computer science departments around the world are now pursuing AI-optimized chips, so central to the future of the technology industry that last October Sundar Pichai, CEO of Google in Mountain View, California, told The Verge that his guiding question today is: “How do we apply AI to rethink our products?” The Chen brothers are by all accounts among the leaders; their Cambricon-1A chip made its commercial debut last fall in a Huawei smartphone billed as the world’s first “real AI phone.” “The Chen brothers are pioneering in terms of specialized chip architecture,” says Qiang Yang, a computer scientist at Hong Kong University of Science and Technology (HKUST) in China.

Such groundbreaking advances far from Silicon Valley were hard to imagine only a few years ago. “China has lagged behind the U.S. in cutting-edge hardware design,” says Paul Triolo, an analyst at the Eurasia Group in Washington, D.C. “But it wants to win the AI chip race.” The country is investing massively in the entire field of AI, from chips to algorithms. The Chen brothers, for example, developed their chip while working at the Institute of Computing Technology of the Chinese Academy of Sciences here, and the academy backed them with seed funding when they spun out Cambricon in 2016. (The company is now worth $1 billion.)

Last summer, China’s State Council issued an ambitious policy blueprint calling for the nation to become “the world’s primary AI innovation center” by 2030, by which time, it forecast, the country’s AI industry could be worth $150 billion. “China is investing heavily in all aspects of information technology,” from quantum computing to chip design, says Raj Reddy, a Turing Award–winning AI pioneer at Stanford University in Palo Alto, California, and Carnegie Mellon University in Pittsburgh, Pennsylvania. “AI stands on top of all these things.”

In recent months, the central government and Chinese industry have been launching AI initiatives one after another. In one of the latest moves, China will build a $2.1 billion AI technology park in Beijing’s western suburbs, the state news service Xinhua reported last month. Whether that windfall will pay off for the AI industry may not be clear for years. But the brute numbers are tilting in China’s favor: The U.S. government’s total spending on unclassified AI programs in 2016 was about $1.2 billion, according to In-Q-Tel, a research arm of the U.S. intelligence community. Reddy worries that the United States is losing ground. “We used to be the big kahuna in research funding and advances.”

Closing the intelligence gap

The United States leads China in private investment in artificial intelligence (AI) and in the number and experience of its scientists. But Chinese firms may gain an advantage from having more data—including data not in the public domain—for honing algorithms.

                                           United States                    China
Years of experience, data scientists       More than half have 10+ years    40% have fewer than 5 years
AI patent applications, 2010–2014          15,317 (first in world)          8,410 (second)
Number of workers in AI positions          850,000 (first)                  50,000 (seventh)
Share of private AI investment (2016)      66% (first)                      17% (second)
Global ranking of data openness            No. 8                            No. 93

China’s advantages in AI go beyond government commitment. Because of its sheer size, vibrant online commerce and social networks, and scant privacy protections, the country is awash in data, the lifeblood of deep learning systems. The fact that AI is a young field also works in China’s favor, argues Chen Yunji, by encouraging a burgeoning academic effort that has put China within striking distance of the United States, long the leader in AI research. “For traditional scientific fields, Chinese [scientists] have a long way to go to compete with the U.S. or Europe. But for computer science, it’s a relatively new thing. Young people can compete. Chinese can compete.” In an editorial last week in The Boston Globe, Eric Lander, president of the Broad Institute in Cambridge, Massachusetts, warned that the United States has at best a 6-month lead over China in AI. “China played no role in launching the AI revolution, but is making breathtaking progress catching up,” he wrote.

The fierce global competition in AI has downsides. University computer science departments are hollowing out as companies poach top talent. “Trends come and go, but this is the biggest one I’ve ever seen—a professor can go into industry to make $500,000 to $1 million” a year in the United States or China, says Michael Brown, a computer scientist at York University in Toronto, Canada.

In a more insidious downside, nations are seeking to harness AI advances for surveillance and censorship, and for military purposes. China’s military “is funding the development of new AI-driven capabilities” in battlefield decision-making and autonomous weaponry, says Elsa Kania, a fellow at the Center for a New American Security in Washington, D.C. In the field of AI in China, she warned in a recent report, “The boundaries between civilian and military research and development tend to become blurred.”

The Chinese government has begun using facial scans to identify pedestrians and jaywalkers.


Just as oil fueled the industrial age, data are fueling advances of the AI age. Many practical AI advances are “more about having a large amount of continually refreshed data and good-enough AI researchers who can make use of that data, rather than some brilliant AI theoretician who doesn’t have as much data,” says computer scientist Kai-Fu Lee, founder of Sinovation Ventures, a venture capital firm here. And China, as The Economist recently put it, is “the Saudi Arabia of data.”

Every time someone enters a search query into Baidu (China’s Google), pays a restaurant tab with WeChat wallet, shops on Taobao (China’s Amazon), or catches a ride with Didi (China’s Uber), among a plethora of possibilities, those user data can be fed back into algorithms to improve their accuracy. A similar phenomenon is happening in the United States, but China now has 751 million people online, and more than 95% of them access the internet using mobile devices, according to the China Internet Network Information Center. In 2016, Chinese mobile payment transactions totaled $5.5 trillion, about 50 times more than in the United States that year, estimates iResearch, a consulting firm in Shanghai, China.

Baidu, which runs China’s dominant search engine, both gathers and exploits much of these data. In parking garages under its futuristic glass-and-steel complex in northern Beijing, cars crowned with LIDAR sensors make test runs, collecting mapping data that will feed Baidu’s autonomous driving lab. In the main lobby, staffers’ faces are scanned to open the security gates. Of China’s tech titans—Baidu, Alibaba, and Tencent—Baidu was the first to pour resources into AI. It now employs more than 2000 AI researchers, including staff in California and Seattle, Washington.

A few years ago, Baidu added an AI-powered image search to its mobile app, allowing a user to snap a photo of a piece of merchandise for the search engine to identify, and then look up price and store information.

Early object recognition programs focused on outlines. But many objects—for example, plates of food in a restaurant—have basically the same outline. What’s needed is more precise detection of interior patterns, or “textures,” says Feng Zhou, a data scientist in Cupertino, California, who heads Baidu’s new Fine-Grained Image Recognition Lab. Now, Baidu’s AI image search can distinguish between, for instance, a stewed tofu dish called mapo tofu and fried tofu dishes. (A U.S. equivalent might be detecting the difference between oatmeal and rice.) Better algorithms have helped, Zhou says, but so has an abundance of training data uploaded by internet users.
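The distinction Zhou draws can be shown with a contrived sketch (not Baidu’s algorithm): two tiny binary “images” share the same outline, so an outline feature cannot separate them, while a simple interior-texture feature can.

```python
# Contrived illustration (not Baidu's method): two 5x5 binary "images"
# with identical outlines but different interior textures.
smooth = [[1] * 5 for _ in range(5)]                        # uniform interior
checker = [[(r + c) % 2 for c in range(5)] for r in range(5)]
for img in (smooth, checker):                               # force identical borders
    for i in range(5):
        img[0][i] = img[4][i] = img[i][0] = img[i][4] = 1

def outline(img):
    """Outline feature: only the border pixels."""
    return (tuple(img[0] + img[4])
            + tuple(img[i][0] for i in range(5))
            + tuple(img[i][4] for i in range(5)))

def texture(img):
    """Texture feature: horizontal transitions in the interior."""
    return sum(img[r][c] != img[r][c + 1]
               for r in range(1, 4) for c in range(1, 3))

print(outline(smooth) == outline(checker))  # True: outlines cannot tell them apart
print(texture(smooth), texture(checker))    # 0 6: interior patterns differ
```

Fine-grained recognition systems learn far richer texture descriptors than this transition count, but the principle is the same: the discriminating signal lives inside the silhouette, not on it.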

The data deluge is also transforming academia. “When the AI textbooks were written, we didn’t have access to that kind of data,” Yang says. “About 5 years ago, we decided that classroom education was not sufficient. We needed to have partnerships with industry, because the big technology companies not only have lots and lots of data, but also a variety of data sources and many interesting contexts to apply AI.” Today, a group of HKUST professors and Ph.D. students work on AI projects with Tencent, China’s social media giant. They have access to data from WeChat, the company’s ubiquitous social network, and are developing “intelligent” chat capabilities for everything from customer service to Buddhist spiritual advice.

Such collaborations are vulnerable, however, as China’s academic outposts struggle to keep faculty members capable of designing new AI algorithms from decamping to industry. “University students know that AI is a very cool thing, which might also make you rich,” Chen Yunji says.

The Chinese government is also drinking from the data firehose—and is honing AI as a tool for staying in power. The State Council’s AI road map explicitly acknowledges AI’s importance to “significantly elevate the capability and level of social governance, playing an irreplaceable role in effectively maintaining social stability.”

Some worry that the government’s embrace of AI could further stifle dissent in China. Enhanced technology for recognizing context and images allows for more effective real-time censorship of online communications, according to a report from The Citizen Lab, a research outfit at the University of Toronto. Also at the heart of this debate is facial recognition technology, which is powered by AI algorithms that analyze minute details of a person’s face in order to pick it out from among thousands or millions of potential matches.
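A common design for such matching (sketched below with random stand-in vectors, not real face data) maps each face to a numeric embedding and identifies a probe face by nearest neighbor under cosine similarity across the enrolled gallery.

```python
import math
import random

# Sketch of embedding-based matching with random stand-in vectors --
# real systems derive the embeddings from face images with a neural net.
random.seed(1)

def rand_vec(dim=128):
    return [random.gauss(0.0, 1.0) for _ in range(dim)]

gallery = {f"person_{i}": rand_vec() for i in range(2000)}  # enrolled faces

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# A "probe" photo of person_42: its embedding is close to, but not
# exactly, the enrolled one (small per-component noise).
probe = [x + random.gauss(0.0, 0.1) for x in gallery["person_42"]]

best = max(gallery, key=lambda name: cosine(probe, gallery[name]))
print(best)  # person_42
```

Scaling this linear scan to millions of faces is what makes the mass-surveillance scenarios described below technically feasible; production systems replace the scan with approximate nearest-neighbor indexes.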

People in China can now use facial scans to authorize digital payments at some fast food restaurants.


Facial recognition is now used routinely in China for shopping and to access some public services. For example, at a growing number of Kentucky Fried Chicken restaurants in China, customers can authorize digital payment by facial scan. Baidu’s facial recognition systems confirm passenger identity at certain airport security gates. Recent AI advances have made it possible to identify individuals not only in up-close still photos, but also in video—a far more complex scientific task.

China’s attitude toward such advances contrasts with the U.S. response. When the U.S. Customs and Border Protection last May revealed plans to use facial matching to verify the identities of travelers on select flights leaving the United States, a public debate erupted. In an analysis, Jay Stanley of the American Civil Liberties Union in Washington, D.C., warned of the potential for “mission creep”: With new AI technologies, “you can subject thousands of people an hour to face recognition when they’re walking down the sidewalk without their knowledge, let alone permission or participation.”

In China the government is already deploying facial recognition technology in Xinjiang, a Muslim-majority region in western China where tensions between ethnic groups erupted in deadly riots in 2009. Reporters from The Wall Street Journal who visited the region late last year found surveillance cameras installed every hundred meters or so in several cities, and they noted facial recognition checkpoints at gas stations, shopping centers, mosque entrances, and elsewhere. “This is the kind of thing that makes people in the West have nightmares about AI and society,” says Subbarao Kambhampati, president of the Association for the Advancement of Artificial Intelligence (AAAI) in Palo Alto and a computer scientist at Arizona State University in Tempe. In China, he says, “people are either not worried, or not able to have those kinds of conversations.”

Even toilet paper in public restrooms is now being dispensed, in limited amounts, after a facial scan.


China’s AI researchers show no signs of slowing down. In October 2016, a White House report found that Chinese researchers now publish more deep learning–related papers in all journals than researchers from any other country. When adjusted for publication impact factor, the United States still produced the most influential AI-related papers, followed by the United Kingdom, with China only narrowly behind, according to a recent McKinsey Global Institute analysis.

Kambhampati adds that before 2012 or so, submissions from China to major AI conferences “used to be quite small.” At AAAI’s annual meeting earlier this week in New Orleans, Louisiana, he says, accepted papers from China nearly equaled those from the United States. “For the longest time, there was a general feeling that China was always second-rate in technology. That may have been true, but it’s also changing quite quickly.”

The government wants the boom to continue. At the end of 2017, the science ministry issued a 3-year plan to guide AI development, and named several large companies as “national champions” in key fields: for example, Baidu in autonomous driving, and Tencent in computer vision for medical diagnosis. Zha Hongbin, a professor of machine intelligence at Peking University here who consults for the government, says China plans to expand the number of universities offering dedicated machine learning and AI departments.

In the meantime, industry continues to bet heavily on AI. Last October, for instance, Alibaba announced plans to invest $15 billion in research over 3 years to build seven labs in four countries that will focus on quantum computing and AI.

A decade ago, China’s best AI researchers might have left for plum jobs in Silicon Valley. Instead, increasing numbers of them are staying at home to lift the nation’s AI industry, says Xia Yan, a 30-year-old data scientist who co-founded Momenta, an autonomous driving startup here. “Many of us are choosing to go from an academic background to running a company,” Xia says. “We want to see our work in the real world. It’s a new era.”

Evolving Financial Models to Support HIT Investments: One Financing Expert’s View – Healthcare Informatics


With an ongoing digital transformation taking place across healthcare, health system executive leaders are increasingly investing in IT and innovative technologies to meet clinical and operational goals.

Yet, at the same time, a survey by the American College of Healthcare Executives found that healthcare CEOs cited financial challenges as their number one concern. Executive leaders must finance IT even as health systems feel mounting financial pressure.

According to data from the Englewood, Colo.-based Medical Group Management Association (MGMA), IT expenses for physician practices are on a slow and steady rise. Last year, for example, physician-owned practices spent roughly $2,000 to $4,000 more per full-time physician on IT operating expenses than they did the prior year. Total IT expenses per physician last year ranged from $14,000 to $19,000, depending on specialty, according to MGMA.

What’s more, a 2016 cost and revenue report from MGMA found that physician-owned multispecialty practices spent more than $32,500 per full-time physician on information technology equipment, staff, maintenance, and other related expenses. In addition, technology costs have grown by more than 40 percent since 2009. Other trends in the healthcare industry, such as practices investing in online patient portals, have also contributed to increased technology costs.

Healthcare Informatics Associate Editor Heather Landi recently spoke with Gary Amos, CEO of Commercial Finance, North America, at Siemens Financial Services (SFS), about financing healthcare IT. Amos, who is based in the Philadelphia area, has been with the organization for 11 years. SFS finances both technology and healthcare equipment for Siemens Healthineers and other leading healthcare providers. Below are excerpts from that interview.



How do you see the landscape around the financing of capital acquisitions in healthcare at this time?

There are a couple of ways to approach it, and I recommend we extend our perspective beyond IT. From what we see in the market, and where the digital transformation is driving healthcare, you need to view it along the entire healthcare continuum: from the experience of the patient, to the healthcare provider, and finally to the viewpoint of a financial expert.

First, let’s view it from the consumer perspective. A technology transformation has patients relying on mobile apps, seeking information online, and becoming more engaged and proactive in managing personal health. Physicians and providers who can offer their patients more customized and automated diagnoses are at a competitive advantage for patient retention. Providers who adopt new digital technologies and equipment are able to further automate and connect patient data across larger healthcare IT networks. This enables providers to manage data smarter and provide stronger diagnoses for patients, increasing speed and efficiency and leading to higher patient satisfaction.

I think from the provider standpoint, we’re currently evaluating different financial models and seeing how they can enable desired outcomes across a wide array of scenarios. You hear a lot in the market right now about MES, or managed equipment services. It’s no longer about how we finance a single asset. The conversation is shifting to how we enable larger projects that include not only the required diagnostic equipment but also services and performance-based metrics that allow for technology evolution and planning cycles over a longer period of time. The new demand for capital is in financing a bundled package with a commitment to a level of service in a formalized agreement with underlying performance metrics in place. Today’s healthcare providers, working with tighter budgets and tasked to do more with less, require a return on investment. That’s why bundled services that can promise specific outcomes are highly desired by today’s providers.

Now, from the financial expert’s perspective, we are helping providers explore financing options that extend beyond a short-term goal. It’s about looking at needs over a longer planning horizon, determining the right equipment to support those needs, and structuring the financing to improve equipment performance and account for asset longevity as the demands of digital technology evolve.

Healthcare organizations continue to face financial pressures. What are some financial techniques that healthcare organizations can use to meet today’s digital demands?

Healthcare digitalization, the collection and electronic exchange of vital biological and clinical data, helps organizations gain maximum value from new digital capabilities. In today’s industry, health systems will need to integrate digital tools and technologies into core processes.

According to the Centers for Medicare and Medicaid Services (CMS), U.S. healthcare spending grew 4.3 percent in 2016 and accounted for nearly 18 percent of gross domestic product. Though healthcare spending is up, budgets are still tight, and the challenge is increasingly how to provide a more precise diagnosis that fosters individualized prevention and therapy. In addition, how do we reduce the time frame of diagnosis and treatment and improve the patient experience across the continuum of care? In the past, a treatment or protocol might have run a course of a number of months. Providers who can reduce the amount of time required to treat or prevent illness will find themselves in a stronger financial position. Reimbursement of resources and capitation payments are driving the headwinds for hospitals, physicians and outpatient centers.

For example, when a primary care provider signs a capitation agreement, a list of specific services for patients must be included in the contract. The amount of the capitation will be determined in part by the number of services provided and will vary from health plan to health plan, but most capitation payment plans for primary care services include preventive, diagnostic, and treatment services, such as injections, immunizations, and medications administered in the office, outpatient laboratory tests done either in the office or at a designated laboratory, health education and counseling services performed in the office, and routine vision and hearing screening.

It is not unusual for large groups or physicians involved in primary care network models to also receive an additional capitation payment for diagnostic test referrals and subspecialty care. When healthcare providers adopt such plans, managed care organizations can control healthcare costs while holding their physicians accountable for delivering improved services.
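As a purely illustrative sketch of the arithmetic (all rates and panel sizes below are hypothetical, not actual health-plan figures), capitation revenue is driven by per-member-per-month rates and panel size rather than by the volume of services rendered:

```python
# Hypothetical numbers only: per-member-per-month (PMPM) capitation.
panels = {
    "plan_a": {"members": 800, "pmpm": 22.50},
    "plan_b": {"members": 450, "pmpm": 19.00},
}
referral_addon_pmpm = 3.25  # extra capitation for diagnostic referrals

base = sum(p["members"] * p["pmpm"] for p in panels.values())
addon = referral_addon_pmpm * sum(p["members"] for p in panels.values())
monthly_revenue = base + addon
print(monthly_revenue)  # 30612.5 -- fixed, regardless of visit volume
```

Because the revenue line is fixed per enrolled member, the practice, not the payer, bears the financial risk of delivering the contracted services within that amount.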

What should healthcare CIOs and CTOs be thinking about right now?

I think from a CTO/CIO standpoint, a lot of what happened in the past is that they were focused on EMRs (electronic medical records), and as those platforms became stable, that allowed for the evolution of a more digitalized age. You are no longer moving patient records in a manila folder from doctor to doctor; records now move through online platforms and mobile devices, giving your doctor a holistic view of the patient’s records and data. Today’s executives need to be concerned with adopting digital technology and equipment that integrates data exchange and enables population health management. For example, if a clinician has a broader view of health patterns and trends across patients, it helps them assess needs and transform care delivery models to improve the patient experience. Transforming care delivery is about leveraging established and new care models to provide more accessible and highly efficient healthcare offerings. For a leader in today’s healthcare environment, the focus should be on digitalizing healthcare processes, expanding precision medicine, transforming care delivery and improving the patient experience.

With the overall trends in healthcare right now—population health, the transition to value-based care, and all the new regulations—how will this impact the financing of healthcare IT in the next few years?

As the country works to adapt to healthcare demands, private financing is uniquely positioned to take a leading role in supporting today’s digital market shift. An aging population, rising chronic conditions, and structural changes from the Affordable Care Act (ACA) impose many financial pressures on healthcare providers. Complex clinical procedures are on the rise, but investments in technology can help make these procedures simpler. In order to meet consumer demands and keep U.S. healthcare infrastructure, technology and services modernized, the healthcare sector requires serious investment. With today’s digital transformation overhauling healthcare, this is where private funding sources can step in to help by enabling organizations to keep pace through updated IT infrastructure.

And, again, you’re seeing financial models evolving as a result of all this, is that right?

Whether it’s a large institutional-type hospital or a smaller-scale physician-owned practice, everyone will have a call to action to transform their business and operational model using the technology that’s available. Some of the traditional financing products, such as loans or equipment leases, will remain but could evolve into different structures. Unitary payment models, where there is a more holistic approach to healthcare management and financing, will drive the digital transformation. Coupling payment models for equipment and services together will continue to be challenged, and unitary structures will move to the forefront of the discussion.

The digital transformation of healthcare technology, through connecting patient data across greater IT networks, will require financial models to evolve with the acceleration of technological advancements. As healthcare technology becomes more automated, service and delivery methods will become more patient-centric than ever before. Financial models will enable healthcare providers to accomplish their clinical and operational goals through the adoption of digitized information technology.

The 40 Best Workplaces in Technology – Fortune

What is KM? Knowledge Management Explained – KMWorld Magazine

The classic one-line definition of Knowledge Management was offered up by Tom Davenport early on (Davenport, 1994): “Knowledge Management is the process of capturing, distributing, and effectively using knowledge.” Probably no better or more succinct single-line definition has appeared since.

However, Knowledge Management can best and most quickly be explained by recapping its origins. Later in this article, its stages of development will also be recapped.

The Origins of KM

The concept and the terminology of KM sprouted within the management consulting community. When the Internet arose, those organizations quickly realized that an intranet, an in-house subset of the Internet, was a wonderful tool with which to make information accessible and to share it among the geographically dispersed units of their organizations. Not surprisingly, they quickly realized that in building tools and techniques such as dashboards, expertise locators, and best practice (lessons learned) databases, they had acquired an expertise which was in effect a new product that they could market to other organizations, particularly to organizations which were large, complex, and dispersed. However, a new product needs a name, and the name that emerged was Knowledge Management. The term apparently was first used in its current context at McKinsey in 1987 for an internal study on their information handling and utilization (McInerney and Koenig, 2011). KM went public, as it were, at a conference in Boston in 1993 organized by Ernst and Young (Prusak 1999). Note that Davenport was at E&Y when he wrote the definition above.

Those consulting organizations quickly disseminated the principles and the techniques of KM to other organizations, to professional associations, and to disciplines. The timing was propitious, as the enthusiasm for intellectual capital (see below) in the 1980s had primed the pump for the recognition of information and knowledge as essential assets for any organization.

What is KM trying to accomplish?

Rich, Deep, and Open Communication

First, KM can very fruitfully be seen as the undertaking to replicate, indeed to create, the information environment known to be conducive to successful R&D—rich, deep, and open communication and information access—and to deploy it broadly across the firm. It is almost trite now to observe that we are in the post-industrial information age and that we are all information workers. Furthermore, the researcher is, after all, the quintessential information worker. Peter Drucker once commented that the product of the pharmaceutical industry wasn’t pills, it was information. The research domain, and in particular the pharmaceutical industry, has been studied in depth with a focus on identifying the organizational and cultural environmental aspects that lead to successful research (Koenig, 1990, 1992). The salient aspect that emerges with overwhelming importance is that of rich, deep, and open communications, not only within the firm, but also with the outside world. The logical conclusion, then, is to attempt to apply those same successful environmental aspects to knowledge workers at large, and that is precisely what KM attempts to do.

Situational Awareness

Second, Situational Awareness is a term only recently, beginning in 2015, used in the context of KM. The term, however, long precedes KM. It first gained some prominence in the cold war era when studies were commissioned by all of the major potential belligerents to try to identify what characteristics made a good fighter pilot. The costs of training a fighter pilot were huge, and if the appropriate characteristics leading to success could be identified, that training could be directed to the most appropriate candidates, and of those trained the most appropriate could be selected for front-line assignment. However, the only solid conclusion of those studies was that the salient characteristic of a good fighter pilot was excellent “situational awareness.” The problem was that no good predictive test for situational awareness could be developed.

The phrase then retreated into relative obscurity until it was resuscitated by Jeff Cooper, a firearms guru, and others in the context of self-defense. How do you defend and protect yourself? The first step is to be alert and to establish good situational awareness. From there the phrase entered the KM vocabulary. The role of KM is to create the capability for the organization to establish excellent situational awareness and consequently to make the right decisions.

A new definition of KM

A few years after the Davenport definition, the Gartner Group created another definition of KM, which has become the most frequently cited one (Duhon, 1998), and it is given below:

“Knowledge management is a discipline that promotes an integrated approach to identifying, capturing, evaluating, retrieving, and sharing all of an enterprise’s information assets. These assets may include databases, documents, policies, procedures, and previously un-captured expertise and experience in individual workers.”

The one real lacuna of this definition is that it, too, is specifically limited to an organization’s own information and knowledge assets. KM as conceived now (and this expansion arrived early on) includes relevant information assets wherever they originate. Note, however, the breadth implied for KM by calling it a “discipline.”

Both definitions share a very organizational and corporate orientation. KM, historically at least, was primarily about managing the knowledge of and in organizations. Rather quickly, however, the concept of KM became much broader than that.

A graphic map of Knowledge Management

Probably the best graphic for setting forth what constitutes KM is still the one developed by IBM for the use of its own KM consultants. It is based upon the distinction between collecting stuff (content) and connecting people. The presentation here includes some minor modifications, but the captivating C, E, and H mnemonics are entirely IBM’s:

Graphic Map of KM

Collecting (Stuff) & Explicit – “Harvest”:

  • Databases, external & internal
  • Content architecture
  • Information service support (training required)
  • Data mining, best practices / lessons learned / after-action analysis

Connecting (People) & Explicit – “Harness”:

  • Community & learning
  • Directories, “yellow pages” (expertise locators)
  • Findings & facilitating tools, groupware
  • Response teams

Collecting (Stuff) & Tacit – “Hunting”:

  • Cultural support
  • Current awareness profiles and databases
  • Selection of items for alerting purposes / push
  • Data mining, best practices

Connecting (People) & Tacit – “Hypothesize”:

  • Cultural support
  • Spaces – libraries & lounges (literal & virtual), groupware
  • Travel & meeting attendance


From: Tom Short, Senior consultant, Knowledge Management, IBM Global Services

(Note however the comments below under “Tacit.”)

OK, what does KM actually consist of?

In short, what are the operational components of a KM system? This is, in a way, the most straightforward manner of explaining what KM is: delineating the operational components that constitute what people have in mind when they talk about a KM system.

(1) Content Management

So what is involved in KM? The most obvious component is making the organization’s data and information available to its members through dashboards, portals, and content management systems. Content Management, sometimes known as Enterprise Content Management, is the most immediate and obvious part of KM. For a graphic snapshot of the content management domain, see the Content Technology Vendor Map. This aspect of KM might be described as Librarianship 101: putting your organization’s information and data up online, plus selected external information, and providing the capability to shift seamlessly to searching, more or less, the entire web. The term most often used for this is Enterprise Search. This is now not just a stream within the annual KMWorld Conference, but has become an overlapping conference in its own right. See the comments below under the “Third Stage of KM” section.

(2) Expertise Location

Since knowledge resides in people, often the best way to acquire the expertise that you need is to talk with an expert. Locating the right expert with the knowledge you need, though, can be a problem, particularly if, for example, the expert is in another country. The basic function of an expertise locator system is straightforward: to identify and locate those persons within an organization who have expertise in a particular area. These systems are now commonly known as expertise location systems. In the early days of KM the term “Yellow Pages” was commonly used, but that term is fast disappearing from our common vocabulary, and expertise location is, in any case, rather more precise.

There are typically three sources from which to supply data for an expertise locator system: (1) employee resumes, (2) employee self-identification of areas of expertise (typically by being asked to fill out a form online), and (3) algorithmic analysis of electronic communications from and to the employee. The latter approach is typically based on email traffic but can include other social networking communications such as Twitter, Facebook, and LinkedIn. Several commercial software packages are available to match queries with expertise. Most of them have load-balancing schemes so as not to overload any particular expert: typically such systems rank the degree of presumed expertise and shift a query down the expertise ranking when the higher choices appear to be overloaded. Such systems also often have a feature by which the requester can flag a request as a priority, and the system then matches high priority to high expertise rank.
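The routing logic just described (rank experts by presumed expertise, shift a query down the ranking past overloaded experts, and let a priority request claim the top-ranked expert regardless of load) can be sketched as follows. This is an illustrative sketch, not the API of any actual commercial package; the names, ranks, and load threshold are invented for the example.

```python
# Sketch of load-balanced expertise routing (hypothetical data and names).
from dataclasses import dataclass

@dataclass
class Expert:
    name: str
    expertise_rank: int   # 1 = highest presumed expertise
    open_queries: int = 0  # current load on this expert

def route_query(experts, max_load=3, high_priority=False):
    """Pick an expert for an incoming query.

    Normal queries skip past experts already at max_load, moving down
    the expertise ranking; high-priority queries go to the best-ranked
    expert regardless of current load.
    """
    ranked = sorted(experts, key=lambda e: e.expertise_rank)
    if high_priority:
        chosen = ranked[0]
    else:
        available = [e for e in ranked if e.open_queries < max_load]
        chosen = available[0] if available else ranked[0]
    chosen.open_queries += 1
    return chosen.name

experts = [Expert("Alice", 1, open_queries=3),  # top expert, but busy
           Expert("Bob", 2, open_queries=0),
           Expert("Carol", 3, open_queries=1)]

print(route_query(experts))                      # Bob (Alice is overloaded)
print(route_query(experts, high_priority=True))  # Alice (priority overrides load)
```

The point of the sketch is the trade-off the text describes: ordinary queries accept a slightly lower-ranked expert to protect the top experts' time, while flagged priority requests are matched to the highest expertise rank.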

(3) Lessons Learned

Lessons Learned databases are databases that attempt to capture and make accessible knowledge, typically “how to do it” knowledge, that has been operationally obtained and normally would not have been explicitly captured. In the KM context, the emphasis is upon capturing knowledge embedded in personal expertise and making it explicit. The lessons learned concept or practice might be described as having been birthed by KM, as there is very little in the way of a direct antecedent. Early in the KM movement, the phrase most often used was “best practices,” but that phrase was soon replaced with “lessons learned,” both because “lessons learned” was a broader and more inclusive term and because “best practice” seemed too restrictive: it could be interpreted as meaning there was only one best practice in a situation. What might be a best practice in North American culture, for example, might well not be a best practice in another culture. The major international consulting firms were very aware of this and led the movement to substitute the more appropriate term. “Lessons learned” became the most common hallmark phrase of early KM development.

The idea of capturing expertise, particularly hard-won expertise, is not a new idea. One antecedent to KM that we have all seen portrayed was the World War II debriefing of pilots after a mission. Gathering military intelligence was the primary purpose, but a clear and recognized secondary purpose was to identify lessons learned, though they were not so named, to pass on to other pilots and instructors. Similarly, the U. S. Navy Submarine Service, after a very embarrassing and lengthy experience of torpedoes that failed to detonate on target, and an even more embarrassing failure to follow up on consistent reports by submarine captains of torpedo detonation failure, instituted a mandatory system of widely disseminated “Captain’s Patrol Reports.” The intent, of course, was to avoid any such fiasco in the future. The Captain’s Patrol Reports, however, were very clearly designed to encourage analytical reporting, with reasoned analyses of the reasons for operational failure and success. It was emphasized that a key purpose of the report was both to make recommendations about strategy for senior officers to mull over, and recommendations about tactics for other skippers and submariners to take advantage of (McInerney and Koenig, 2011).

The military has become an avid proponent of the lessons learned concept. The phrase the military uses is “After Action Reports.” The concept is very simple: make sure that what has been learned from experience is passed on, and don’t rely on the participant to make a report. There will almost always be too many things immediately demanding that person’s attention after an action. There must be a system whereby someone, typically someone in KM, is assigned the responsibility to do the debriefing, to separate the wheat from the chaff, to create the report, and then to ensure that the lessons learned are captured and disseminated. The experiences in Iraq, Afghanistan, and Syria have made this process almost automatic in the military.

The concept is by no means limited to the military. Larry Prusak (2004) maintains that in the corporate world the most common cause of KM implementation failure is that so often the project team is disbanded and the team members almost immediately reassigned elsewhere before there is any debriefing or after-action report assembled. Any organization where work is often centered on projects or teams needs to pay very close attention to this issue and set up an after-action mechanism with clearly delineated responsibility for its implementation.

A particularly instructive example of a lesson learned is one recounted by Mark Mazzie (2003), a well-known KM consultant. The story comes from his experience in the KM department at Wyeth Pharmaceuticals. Wyeth had recently introduced a new pharmaceutical agent intended primarily for pediatric use. Wyeth expected it to be a notable success because, unlike its morning, noon, and night competitors, it needed to be administered only once a day, which would make it much easier for the caregiver to ensure that the child followed the drug regimen, and would be less onerous for the child. Sales of the drug commenced well but soon flagged. One sales rep (what the pharmaceutical industry used to call detail men), however, by chatting with her customers, discovered the reason for the disappointing sales and also recognized the solution. The problem was that kids objected strenuously to the taste of the drug, and caregivers were reporting to prescribing physicians that they couldn’t get their kids to continue taking it, so the old stand-by would be substituted. The simple solution was orange juice, a swig of which quite effectively masked the offensive taste. Once the sales rep explained to the physician that the therapy should be conveyed to the caregiver as the pill and a glass of orange juice taken together at breakfast, there was no dissatisfaction and sales were fine.

The obvious question is: what is there to encourage the sales rep to share this knowledge? The sales rep is compensated by salary (small) and bonus (large). If she shares the knowledge, she jeopardizes the size of her bonus, which is based on her comparative performance.

This raises the issue, discussed below, that KM is much more than content management. It extends to how one structures the organizational culture to facilitate and encourage knowledge sharing, and that in turn extends to how one structures the organization’s compensation scheme.

The implementation of a lessons learned system is complex both politically and operationally. Many of the questions surrounding such a system are difficult to answer. Are employees free to submit items to the system unvetted? Who, if anyone, decides what constitutes a worthwhile lesson learned? Most successful implementations have concluded that such a system needs to be monitored, and that there needs to be a vetting and approval mechanism for items posted as lessons learned.

How long do items stay in the system? Who decides when an item is no longer salient and timely? Most successful lessons learned systems have an active weeding or stratification process. Without a clearly designed weeding process, the proportion of new and crisp items inevitably declines, the system begins to look stale, and usage and utility fall. Deletion, of course, is not necessarily loss and destruction. Using carefully designed stratification principles, items removed from the foreground can be archived and moved to the background but still made available. However, this procedure needs to be in place before things start to look stale, and a good taxonomically based retrieval system needs to be created.
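The weeding-and-stratification idea can be sketched as follows: items past a cutoff age are not deleted but moved from the foreground store to an archive keyed by taxonomy category, where they remain retrievable. All field names, categories, and thresholds here are illustrative assumptions, not drawn from any particular system.

```python
# Sketch of lessons-learned weeding with archival stratification.
from datetime import date

def weed(lessons, today, max_age_days=730):
    """Split lessons into a fresh foreground list and a categorized archive."""
    foreground, archive = [], {}
    for lesson in lessons:
        if (today - lesson["posted"]).days <= max_age_days:
            foreground.append(lesson)
        else:
            # Archived, not destroyed: still findable via its taxonomy category.
            archive.setdefault(lesson["category"], []).append(lesson)
    return foreground, archive

lessons = [
    {"title": "Vendor onboarding checklist", "category": "procurement",
     "posted": date(2023, 11, 1)},
    {"title": "Legacy migration pitfalls", "category": "IT",
     "posted": date(2019, 3, 15)},
]
fg, arch = weed(lessons, today=date(2024, 6, 1))
print([l["title"] for l in fg])  # the recent item stays in the foreground
print(sorted(arch))              # the stale item is archived under its category
```

The design choice the text argues for is visible in the code: the stale item is moved, not dropped, so a taxonomically based retrieval system can still surface it later.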

These questions need to be carefully thought out and resolved, and the mechanisms designed and put in place, before a lessons-learned system is launched. Inattention can easily lead to failure and the creation of a bad reputation that will tar subsequent efforts.

(4) Communities of Practice (CoPs)

CoPs are groups of individuals with shared interests who come together, in person or virtually, to tell stories, to share and discuss problems and opportunities, to discuss best practices, and to talk over lessons learned (Wenger, 1998; Wenger & Snyder, 1999). Communities of practice emphasize, build upon, and take advantage of the social nature of learning within or across organizations. In small organizations, conversations around the water cooler are often taken for granted, but in larger, geographically distributed organizations, the water cooler needs to become virtual. Similarly, organizations find that when workers relinquish a dedicated company office to work online from home or on the road, the natural knowledge sharing that occurs in social spaces needs to be replicated virtually. In the context of KM, CoPs are generally understood to mean electronically linked communities. Electronic linkage is not essential, of course, but since KM arose in the consulting community from an awareness of the potential of intranets to link geographically dispersed organizations, this orientation is understandable.

A classic example of the deployment of CoPs comes from the World Bank. When James Wolfensohn became president in 1995, he focused on the World Bank’s role in disseminating knowledge about development; he was known to say that the principal product of the World Bank was not loans, but rather the creation of knowledge about how to accomplish development. Consequently, he encouraged the development of CoPs and made that a focus of his attention. One World Bank CoP, for example, was about road construction and maintenance in arid countries and conditions. That CoP was encouraged to include and seek out not only participants and employees from the World Bank and its sponsored projects and from the country where the relevant project was being implemented, but also experts from elsewhere who had expertise in building roads in arid conditions, such as, for example, staff from the Australian Road Research Board and the Arizona Department of Highways. This is also a good example of the point that despite the fact that KM developed first in a very for-profit corporate context, it is applicable far more broadly, such as in the context of government and civil society.

The organization and maintenance of CoPs is not a simple or an easy task to undertake. As Durham (2004) points out, there are several key roles to be filled. She describes the key roles as manager, moderator, and thought leader. They need not necessarily be three separate people, but in some cases they will need to be. Some questions that need to be thought about and resolved are:

  • Who fills the various roles of: manager, moderator, and thought leader?
  • How is the CoP managed?
  • Who will have overall responsibility for coordinating and overseeing the various CoPs?
  • Who looks for new members or suggests that the CoP may have outlived its usefulness?
  • Who reviews the CoP for activity?
  • Are postings open or does someone vet or edit the postings?
  • How is the CoP kept fresh and vital?
  • When and how (under what rules) are items removed?
  • How are those items archived?
  • How are the CoP files made retrievable? How do CoP leaders coordinate with the enterprise search/taxonomy function?

Another way to view KM is to look at the stages of KM’s development

First Stage of KM: Information Technology

KM was initially driven primarily by information technology (IT) and the desire to put that new technology, the Internet, to work and see what it was capable of. That first stage has been described, using a horse-breeding metaphor, as “by the Internet out of intellectual capital,” the sire and the dam. The concept of intellectual capital, the notion that not just physical resources, capital, and manpower, but also intellectual capital (knowledge), fueled growth and development, provided the justification, the framework, and the seed. The availability of the Internet provided the tool. As described above, the management consulting community jumped at the new capabilities provided by the Internet, using it first for themselves, realizing that if they shared knowledge across their organizations more effectively they could avoid reinventing the wheel, underbid their competitors, and make more profit. The central point is that the first stage of KM was about how to deploy that new technology to accomplish more effective use of information and knowledge.

The first stage might be described as the “If only Texas Instruments knew what Texas Instruments knew” stage, to revisit a much quoted KM mantra. The hallmark phrase of Stage 1 was first “best practices,” later replaced by the more politic “lessons learned.”

Second Stage of KM: HR and Corporate Culture

Within a few years the second stage of KM emerged, when it became apparent that simply deploying new technology was not sufficient to enable effective information and knowledge sharing; human and cultural dimensions needed to be incorporated. The second stage can be described as the “‘If you build it they will come’ is a fallacy” stage: the recognition that “if you build it they will come” is a recipe that can easily lead to quick and embarrassing failure if human factors are not sufficiently taken into account.

It became clear that KM implementation would involve changes in the corporate culture, in many cases rather significant changes. Consider the case above of the new pediatric medicine and the discovery of the efficacy of adding orange juice to the recipe. Pharmaceutical sales reps are compensated primarily not by salary, but by bonuses based on sales results. What is in it for that sales rep to share her new discovery when the most likely result is that next year her bonus would be substantially reduced? The changes needed in corporate culture to facilitate and encourage information and knowledge sharing can be major and profound. KM therefore extends far beyond just structuring information and knowledge and making it more accessible. In particular, the organizational culture needs to be examined in terms of how it rewards information and knowledge sharing. In many cases the examination will reveal that the culture needs to be modified and enriched. Often this will involve examining and modifying how the compensation scheme rewards information and knowledge sharing.

This implies a role for KM that very few information professionals have had to be involved with in the past. The implication is clear that KM leaders should be involved in the decision making process for designing the organization’s compensation policy, a process that is very sensitive politically and fraught with difficulty.

A major component of this second stage was the design of easy-to-use and user-friendly systems. The metaphor that was used was that the interface, the Desktop Interface, should appear almost intuitively obvious, like the dashboard of an automobile. (This was of course before the proliferation of chips in automobiles and the advent of user manuals that were inches thick.) Human factors design became an important component of KM.

As this recognition of the importance of human factors unfolded, two major themes from the business literature were brought into the KM domain. The first was Senge’s work on the learning organization (Senge, 1990, The Fifth Discipline: The Art and Practice of the Learning Organization). The second was Nonaka’s work on “tacit” knowledge and how to discover and cultivate it (Nonaka & Takeuchi, 1995, The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation). Both were about not only the human factors of KM implementation and use, but also knowledge creation, knowledge sharing, and communication. The hallmark phrase of Stage 2 was “communities of practice” (CoPs). A good indicator of the shift from the first to the second stage of KM is that at the 1998 Conference Board conference on KM there was, for the first time, a noticeable contingent of attendees from HR, human resource departments. By the next year, 1999, HR was the largest single group, displacing IT attendees from first place.

Third Stage of KM: Taxonomy and Content Management

The third stage developed from an awareness of the importance of content, and in particular of the retrievability of that content, and therefore of the arrangement, description, and syndetic structure of that content. Since a good alternative description for the second stage of KM is the “it’s no good if they don’t use it” stage, perhaps the best description for the third stage is the “it’s no good if they try to use it but can’t find it” stage. Another bellwether is that TFPL’s report on its October 2001 CKO (Chief Knowledge Officer) Summit noted that taxonomies emerged there for the first time as a topic, and as a major topic at that (TFPL, 2001, Knowledge Strategies – Corporate Strategies). The hallmark phrases of the third stage are content management (or enterprise content management) and taxonomy. At the KMWorld 2000 Conference, a track on Content Management appeared for the first time, and by the 2001 KMWorld Conference, Content Management had become the dominant track. In 2006, KMWorld added a two-day workshop entitled Taxonomy Boot Camp, which not only continues today, now a day longer, but has also expanded to international locations.

The third stage continues today and is expanding. A major theme now is the application of “data analytics” and “machine learning” to “enterprise search.” The crux is to be able to manage and retrieve your data effectively without maintaining a stable full of taxonomists. A good recent example derives from Rolls Royce’s sale of a subsidiary company. The buyer was entitled to voluminous files of company records, but how was Rolls Royce to separate those from other records containing valuable proprietary data that was not part of the sale and that Rolls Royce wanted to keep proprietary, amidst a sea of structured and unstructured data? The answer was a major project to taxonomize, organize, index, and retrieve massive amounts of data and records. Data analytics and machine learning were powerful tools to help accomplish this, but note the word “help”: those tools will not replace the need for good and intelligent human guidance, training, and oversight.
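The Rolls Royce episode is described only in outline above, so the following is a generic sketch of the kind of technique involved: a tiny Naive Bayes text classifier, trained on a handful of human-labeled records, that suggests whether a document should be transferred to a buyer or retained as proprietary. All documents and labels are invented for illustration; a real project would pair such a tool with taxonomy work and, as the text stresses, human review.

```python
# Minimal Naive Bayes text classifier (pure stdlib) for illustration.
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: iterable of (text, label). Returns per-label word counts."""
    counts = defaultdict(Counter)   # label -> word frequency
    labels = Counter()              # label -> document count
    for text, label in docs:
        labels[label] += 1
        counts[label].update(text.lower().split())
    return counts, labels

def classify(text, counts, labels):
    vocab = {w for c in counts.values() for w in c}
    best, best_score = None, -math.inf
    for label in labels:
        total = sum(counts[label].values())
        score = math.log(labels[label] / sum(labels.values()))
        for w in text.lower().split():
            # Laplace smoothing so unseen words don't zero out a class.
            score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

training = [
    ("subsidiary maintenance schedule for delivered units", "transfer"),
    ("subsidiary customer contact list", "transfer"),
    ("proprietary turbine blade alloy composition", "retain"),
    ("proprietary engine test results", "retain"),
]
counts, labels = train(training)
print(classify("turbine alloy test data", counts, labels))        # retain
print(classify("customer maintenance schedule", counts, labels))  # transfer
```

Even this toy version shows why the word “help” matters: the classifier only echoes its training labels, so the quality of the human labeling and oversight determines the quality of the separation.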

The author is reminded of an occurrence some years ago at Mitre Corporation, a very information-driven organization, before the term KM was even bruited about, when the organization wanted to create an expertise location file/database. A senior vice president’s first thought was that just putting the employees’ resumes online would suffice. A short demonstration convinced him otherwise. One sample search was for employees with expertise in “Defense Logistics,” a topic relevant to an RFP that Mitre was bidding on. The clincher was a resume containing neither word, but with the phrase “battlefield re-supply.”
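The Mitre anecdote can be made concrete with a small sketch of why literal keyword search fails and how a taxonomy or thesaurus fixes it: the resume says “battlefield re-supply,” the query says “defense logistics,” and only a curated synonym mapping connects the two. The thesaurus entries and resume text here are invented for illustration.

```python
# Sketch of thesaurus-based query expansion over a resume "database".
# The synonym mapping is a hypothetical, hand-curated taxonomy entry.
THESAURUS = {
    "defense logistics": ["battlefield re-supply", "military supply chain"],
}

def search(query, resumes, thesaurus=None):
    """Return names whose resume text matches the query or its synonyms."""
    terms = [query.lower()]
    if thesaurus:
        terms += [s.lower() for s in thesaurus.get(query.lower(), [])]
    return [name for name, text in resumes.items()
            if any(t in text.lower() for t in terms)]

resumes = {"Smith": "Led battlefield re-supply planning for armored units"}

print(search("Defense Logistics", resumes))             # [] -- literal match misses
print(search("Defense Logistics", resumes, THESAURUS))  # ['Smith']
```

The design point is the one the anecdote makes: the intelligence lives in the curated mapping, which is exactly the kind of human taxonomic work the senior vice president initially assumed away.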

A good idea is to browse the KMWorld website for current reports, white papers, webinars, etc., on topics such as “text analytics,” “big data,” or “cognitive computing.”

Yet one more definition/description

The late 20th Century, extending into the 21st Century, was characterized by an almost continuous stream of information and knowledge-related topics and enthusiasms.

Below is a list of those enthusiasms, in roughly chronological order, with the earlier at the top of the list. In some cases, where it is today not so obvious from the name, there is a brief description of what the topic or the enthusiasm consisted of.

  • Minimization of Unallocated Cost—the thesis that, for optimal decision making, costs should be accurately allocated to products. As data processing became an ever more important and larger component of an organization’s expenses, yet was usually treated as G&A (General and Administrative) expense, a lower and lower proportion of organizational budgets was clearly allocated to products, and decision making suffered.
  • I.T. and Productivity
  • Data Driven System Design—the thesis that the basis of good system design was not the classic if-then-else flow chart, but was based on the charting of procedures and information flow within the organization, and only then came the if-then-else detailed analysis. This drove a dramatically increased recognition of the importance of data and information in systems design and in an organization’s operations in general.
  • Decision Analysis—the addition to the basic systems analysis construct that there is often an unrecognized and unanalyzed option in any decision situation—i.e., to go back and deploy some resources to get better information, and then return and consequently have a better chance of making the optimal decision. Further, one can, in principle, compare the cost of better information with the expected value of a better decision.
  • Information Systems Stage Hypotheses–there was a profusion of stage hypotheses in the late 1970s and early 1980s: by Nolan, Rockart, Marchand, Gibson & Jackson, Koenig, and Zachman. All had ramifications for, and were for the most part publicized for, the implementation and management of I.T.
  • Managing the Archipelago (of Information Services)—the thesis that, for primarily historical reasons, the information handling responsibilities in an organization are usually administratively scattered like the islands of an archipelago; that this creates severe administrative and managerial problems; and that only very senior management is in a position to get a handle on the problem and needs to address it very consciously.
  • I.T. as Competitive Advantage
  • Management Information Systems (MIS) to Decision Support Systems (DSS)—the recognition that the disenchantment with MIS was caused primarily by its focus solely on internal information, that most management decisions depended more on external information, and that DSSs needed to address access to external information and knowledge.
  • Enterprise-Wide Information Analysis—this was IBM’s mantra for promotion to their customers’ senior management that, to be successful, an organization 1) had to define what its enterprise really consisted of, and determine what business it was really in; 2) then it had to analyze and define what decisions it had to make correctly to be successful in that enterprise; 3) then it had to analyze what information it needed to make those decisions correctly, and obtain and process that information.
  • Information Resource Management—the concept that information was not only a resource, but was also often a product. The Paperwork Reduction Act mandated that all government agencies appoint a senior administrator in charge of Information Resource Management.
  • I.T. and Organizational Structure
  • Total Quality Management (TQM) and Benchmarking
  • Competitive Intelligence (CI)
  • I.T. and the Shift from Hierarchies to Markets—the thesis that better I.T. inevitably shifts the optimal effectiveness trade-off point toward the market end of the central-planning-to-market-economy spectrum.
  • Business Process Re-Engineering
  • Core Competencies
  • Data Warehousing and Data Mining (more recently known as Big Data)
  • E-Business
  • Intellectual Capital
  • Knowledge Management
  • Enterprise Resource Planning (ERP)—the not very obvious name for the idea of integrating all your business’s I.T. operations under one software suite.
  • Customer Relationship Management (CRM)
  • Supply Chain Management (SCM)
  • Enterprise Content Management (ECM)

The list is impressively long, and all these topics and enthusiasms are related to the management of information and knowledge, or the management of information processing functions. Indeed, it would be very hard to come up with even a much shorter list of management topics and enthusiasms of the same era that were not related to the management of information and knowledge or of information processing functions.

If the list is so long, and its items have so major a theme in common, has there not been some recognition that all these trees constitute a forest? Well, there was (Koenig, 2000): “information driven management” was the name put forward for the forest at the time, but it received comparatively little exposure or momentum.

One interesting way to look at KM is that KM has expanded to become, and is now, the recognition of that forest of trees (McInerney and Koenig, 2011), and that KM is a much better and more recognized name than “information driven management.” It is interesting that this stream of trees, to mix metaphors, has dwindled dramatically since the appearance of KM as an important topic. It can further be argued that the typical new topic or enthusiasm, the cloud and big data for example, can be seen as emerging from within KM.

Other KM Issues

Tacit Knowledge

The KM community uses the term “tacit knowledge” to mean what is not “explicit knowledge,” and in that usage what is usually meant by “tacit” is implicit knowledge: that which is not explicit or formally captured in some fashion, most obviously the knowledge in people’s heads. A more useful and nuanced categorization is explicit, implicit, and tacit. There is indeed tacit knowledge that resides only in someone’s head. Nonaka tells the story of the tacit knowledge that was necessary to develop a home bread maker: to design a machine that would knead dough properly, the engineers had to work with bread makers to get a feel for how the dough needed to be manipulated.

But frankly, the extent of truly tacit knowledge, like how to get up on water skis, that overlaps with the interests of KM systems is rather small. What is often very extensive is the amount of implicit information that could have been made explicit but has not been. That it has not been is usually not a failure, but simply a cost-effective decision, often taken unconsciously, that the effort is not worthwhile. The danger lies in assuming that explicit information is addressed by “collecting” and tacit information by “connecting,” and never examining whether there is potentially important implicit information that could and should be made explicit. The after-action comments above under Lessons Learned illustrate this important point.

Knowledge Retention and Retirees

One long-standing KM issue is the need to retain the knowledge of retirees. The fact that the baby boomer bulge is now reaching retirement age is making this issue increasingly important, and KM techniques are very relevant to it. The most obvious technique is the application of the lessons learned idea: treat the retiree’s career as a long project that is coming to its end and create an after-action report, a massive data dump. This seems straightforward enough, and debriefing the retiree, and those with whom he or she worked closely, about the issues they perceive as likely to arise is obvious common sense. But only in special cases is the full data dump approach likely to be very useful. When a current employee has a knowledge need, how likely is he or she to look for the information, and how likely is it that the search request will map onto the information in the retiree’s data dump?

Much more likely to be useful is to keep the retiree involved: maintained in the CoPs, involved in discussions of current issues, and findable through expertise locator systems. The real utility is likely to be found not directly in the information that the retiree leaves behind, but in new knowledge created by the interaction of the retiree with current employees. The retiree, in response to a current issue, says “it occurs to me that …” and elicits a response like “yes, the situation is somewhat similar, but here …”; a discussion unfolds, the retiree contributes some of the needed expertise, and a solution is generated. The solution arises partially from the retiree’s knowledge, but more from the interaction.

The Scope of KM

Another major development is the expansion of KM beyond the 20th-century vision, embodied in the Gartner Group definition, of KM as the management of the organization’s own knowledge. Increasingly, KM is seen as ideally encompassing the whole bandwidth of information and knowledge likely to be useful to an organization, including knowledge external to it: knowledge emanating from vendors, suppliers, customers, etc., and knowledge originating in the scientific and scholarly community, the traditional domain of the library world. Looked at in this light, KM extends into environmental scanning and competitive intelligence.

The additional definition of KM above, “Yet One More Definition of KM,” the forest rather than the trees, also makes the case that the definition of KM is now very broad.

Is KM here to stay?

The answer certainly appears to be yes. The most compelling evidence is bibliometric: simply counting the number of articles in the business literature and comparing that count with the counts for other business enthusiasms. Most business enthusiasms grow rapidly, reach a peak after about five years, and then decline almost as rapidly as they grew.

Below are the graphs for three hot management topics (or fads) of recent years:

Quality Circles, 1977-1986
Source: Abrahamson, 1996

Total Quality Management, 1990-2001
Source: Ponzi & Koenig, 2002

Business Process Reengineering, 1990-2001
Source: Ponzi & Koenig, 2002

KM looks dramatically different:

This graph charts the number of articles in the business literature with the phrase “Knowledge Management” in the title.

If we chart the number of articles in the business literature with the phrase “Knowledge Management” or the abbreviation “KM” in the title, we get the chart below, with an order of magnitude more literature:

KM Growth 2001-2011

It does indeed look as though KM is no mere enthusiasm; KM is here to stay.
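The bibliometric test described above is simple to reproduce. Below is a minimal sketch, assuming a list of (year, title) records harvested from a business-literature database; the function name and the sample records are hypothetical, for illustration only:

```python
from collections import Counter

def km_articles_per_year(records):
    """Count articles per year whose title mentions KM.

    `records` is an iterable of (year, title) pairs; a title counts
    if it contains the phrase "knowledge management" or the
    standalone abbreviation "KM".
    """
    counts = Counter()
    for year, title in records:
        words = title.upper().split()
        if "KNOWLEDGE MANAGEMENT" in title.upper() or "KM" in words:
            counts[year] += 1
    return dict(counts)

# Hypothetical sample records, for illustration only.
sample = [
    (2001, "Knowledge Management: Another Management Fad?"),
    (2001, "Measuring KM Maturity"),
    (2002, "Total Quality Management Revisited"),  # not a KM title
    (2002, "KM and the Learning Organization"),
]
print(km_articles_per_year(sample))  # {2001: 2, 2002: 1}
```

Plotting such yearly counts against those for Quality Circles or TQM is all the "management fad" test requires.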


References

Abrahamson, E. & Fairchild, G. (1999). Management fashion: lifecycles, triggers, and collective learning processes. Administrative Science Quarterly, 44, 708-740.

Davenport, Thomas H. (1994). Saving IT’s Soul: Human Centered Information Management. Harvard Business Review, March-April, 72(2), 119-131.

Duhon, Bryant (1998). It’s All in our Heads. Inform, September, 12(8).

Durham, Mary. (2004). Three Critical Roles for Knowledge Management Workspaces. In M.E.D. Koenig & T. K. Srikantaiah (Eds.), Knowledge Management: Lessons Learned: What Works and What Doesn’t. (pp. 23-36). Medford NJ: Information Today, for The American Society for Information Science and Technology.

Koenig, M.E.D. (1990) Information Services and Downstream Productivity. In Martha E. Williams (Ed.), Annual Review of Information Science and Technology: Volume 25, (pp. 55-56). New York, NY: Elsevier Science Publishers for the American Society for Information Science.

Koenig, M.E.D. (1992). The Information Environment and the Productivity of Research. In H. Collier (Ed.), Recent Advances in Chemical Information (pp. 133-143). London: Royal Society of Chemistry.

Mazzie, Mark. (2003). Personal Communication.

Koenig, M.E.D. (2000). The Evolution of Knowledge Management. In T. K. Srikantaiah & M.E.D. Koenig (Eds.), Knowledge Management for the Information Professional (pp. 23-26). Medford, NJ: Information Today, for the American Society for Information Science.

McInerney, Claire, and Koenig, Michael E. D., (2011), Knowledge Management (KM) Processes in Organizations: Theoretical Foundations and Practice, Morgan and Claypool.

Nonaka, I. & Takeuchi, H. (1995). The knowledge creating company: How Japanese Companies Create the Dynamics of Innovation. New York: Oxford University Press.

Ponzi, L., & Koenig, M.E.D. (2002). Knowledge Management: Another Management Fad? Information Research, 8(1). Retrieved from

Prusak, Larry. (1999). Where did Knowledge Management Come From? Knowledge Directions, 1(1), 90-96.

Prusak, Larry. (2004). Personal Communication.

Senge, Peter M. (1990). The Fifth Discipline: The Art & Practice of the Learning Organization. New York, NY: Doubleday Currency.

Wenger, Etienne C. (1998). Communities of practice: Learning, meaning and identity. Cambridge: Cambridge University Press.

Wenger, Etienne C. & Snyder, W. M. (1999). Communities of practice: The organizational frontier. Harvard Business Review, 78(1), 139-145.

About the Author

Michael E.D. Koenig, Ph.D., is the author or co-author of a number of books on KM, including Knowledge Management in Practice, and of numerous articles on the subject. He is Professor Emeritus at Long Island University and the former and founding dean of the College of Information and Computer Science. In 2015 he received the Award of Merit from the Association for Information Science and Technology, the association’s highest award.

7 Ed Tech Trends to Watch in 2018 – Campus Technology

Virtual Roundtable

7 Ed Tech Trends to Watch in 2018

What education technologies and trends will have the most impact in the coming year? We asked four higher ed IT leaders for their take.

Whenever we analyze the landscape of higher education technology, we find a range of trends in various stages of development. There are topics with real staying power, such as learning space design (which has factored into our trends list for several years). Others have evolved over time: Virtual reality made our list in 2016, then expanded to include augmented and mixed reality in 2017, and this year makes up part of a broader concept of immersive learning. And while some topics, like video, have been around for ages, new developments are putting them into a different light.

To help make sense of it all, we asked a panel of four IT leaders from institutions across the country for their thoughts. Here’s what they told us.

Our Panelists

Brian Fodrey

Assistant Dean for Facilities & Information Technology and Chief Information Officer, School of Government, The University of North Carolina at Chapel Hill

David Goodrum

Director of Academic Technology, Information Services, Oregon State University

Thomas Hoover

Chief Information Officer and Dean of the Library, University of Louisiana Monroe

Anu Vedantham

Director of Learning and Teaching Services for FAS Libraries, Harvard Library, Harvard University

1) Data-Driven Institutions

Brian Fodrey: In the age of big data, with leaders focused on making data-driven decisions, having a data and information management strategy in place in IT is no longer a luxury; it is quickly becoming a necessity.

A unified data-standardization effort can improve all systems and processes, and can be directly managed by assessing how data is collected, cleansed and ultimately stored. Employing a data-in, information-out mindset forces us to be strategic about why data is being requested, how it is solicited and the manner in which it will inform future offerings, services and systems enterprise-wide. Additionally, reliable data sets lessen the need for redundant collection points at various application levels, and instead create a more uniform and positive user experience.

Beyond the capturing and management of data, understanding and recognizing the diversity in where and how all constituents at an institution are consuming various data sets can also lead to learning more about those who value our information, utilize our services and influence how we collect data in the future.

Thomas Hoover: Data and big data have been buzzwords — rightfully so — for the last several years. Universities are making great progress when it comes to using data to help with retention and student success. However, there is still much room for improvement to take advantage of data-driven decision-making across the entire campus.

For instance, data can be used to determine if classrooms are being utilized optimally before new construction projects are kicked off. It can and should be used to determine if aging computer labs should be renewed or transformed into something that is more useful to the university. Efforts like these can not only streamline campus operations, but also ensure that we are making the most of the resources we have in the service of teaching and learning.
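As a concrete illustration of that kind of analysis, here is a minimal sketch of a classroom-utilization check; the room names, booking figures and the 40-hour scheduling week are hypothetical assumptions, not drawn from any particular campus:

```python
# Hours a room could be scheduled in a typical teaching week (assumption).
AVAILABLE_HOURS_PER_WEEK = 40

def utilization(scheduled_hours):
    """Fraction of available weekly hours a room is actually booked."""
    return scheduled_hours / AVAILABLE_HOURS_PER_WEEK

def underused_rooms(rooms, threshold=0.5):
    """Return the names of rooms booked below the utilization threshold."""
    return [name for name, hours in rooms.items()
            if utilization(hours) < threshold]

# Hypothetical weekly bookings, room -> scheduled hours.
rooms = {"Hall 101": 36, "Annex 12": 14, "Lab B": 22}
print(underused_rooms(rooms))  # ['Annex 12']
```

A report like this, run against real scheduling data, is the sort of evidence that can inform a renovate-or-repurpose decision before construction is approved.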

Another underused resource is GIS data. Historically, GIS data has been used primarily in the hard sciences, but that same data could be analyzed in practically any class on a college campus. Think history, political science, criminal justice, urban planning: there is so much data out there, and we can all do a better job of using it.

Volume 65 Issue 9

Report of the Ad Hoc Consultative Committee for the Selection of a Dean of the School of Dental Medicine

  • October 16, 2018
  • vol 65 issue 9
  • News

    The Ad Hoc Consultative Committee for the Selection of a Dean of the School of Dental Medicine (SDM) was convened by President Amy Gutmann on September 26, 2017. During its four months of work, the full Committee met on nine occasions and reported its recommendations to the President and the Provost on February 1, 2018. The Committee members were:

    Chair: Antonia Villarruel, Professor and Margaret Bond Simon Dean of Nursing

    Faculty: Hydar Ali, Professor of Pathology and Director of Faculty Advancement and Diversity, SDM

    Faizan Alawi, Associate Professor of Pathology; Director, Penn Oral Pathology Services; and Associate Dean for Academic Affairs, SDM

    Kathleen Boesze-Battaglia, Professor of Biochemistry, SDM

    Eve Higginbotham, Professor of Ophthalmology and Vice Dean for Inclusion and Diversity, PSOM

    Kelly Jordan-Scuitto, Chair and Professor of Pathology (SDM) and Associate Dean for Graduate Education and Director of Biomedical Graduate Studies (PSOM)

    Bekir Karabucak, Chair and Associate Professor of Endodontics, SDM

    Eric Stoopler, Associate Professor of Oral Medicine and Director, Oral Medicine Residency Program, SDM

    Students: Sehe Han, D’18

    Bret Lesavoy, D’19

    Alumni: William Cheung, Chair of the Board of Overseers

    Martin Levin, Member of the Board of Overseers

    Ex Officio: Joann Mitchell, Senior Vice President for Institutional Affairs and Chief Diversity Officer

    The search was supported by Adam P. Michaels, Deputy Chief of Staff in the President’s Office, and Dr. Warren Ross of the executive search firm Korn Ferry.

    The Committee and its consultants conducted informational interviews and consultative meetings with individuals and groups throughout the Penn and Penn Dental Medicine communities, as well as many informal contacts, in order to better understand the scope, expectations and challenges of the Dean’s position and the opportunities facing the University in the years ahead. These consultative activities included full Committee meetings with Dean Denis Kinane and Interim Dean Designate Dana Graves and members of the Penn Dental Medicine leadership team, including the associate deans. In addition, the Chair and the Committee members held open meetings for various Penn Dental Medicine constituencies. The consultants interviewed administrators from the central administration and from Penn Dental Medicine and sought nominations from academics and practitioners across the nation and the world as well as from leaders in government, foundations, academic societies and other organizations. Finally, members of the Committee engaged in extensive networking with Penn faculty and students, as well as colleagues at other institutions. The Committee also solicited advice and nominations from Penn Dental Medicine faculty, staff and students as well as Penn Deans and faculty and staff from across the campus via email and reviewed a variety of documents about the school.

    Based upon these conversations and materials, the Committee’s charge from the President and the Provost, and the Committee’s own discussions, a comprehensive document was prepared outlining the scope of the position and the challenges a new Dean will face, as well as the qualities sought in a new Dean. The vacancy was announced (and input invited from the entire Penn community) in Almanac.

    Over the course of its four-month search process, the Committee and its consultants contacted and considered more than 230 individuals for the position. From this group, the committee evaluated an initial pool of 43 nominees and applicants and ultimately selected 10 individuals for semi-finalist interviews with the entire Committee. Based on voluntary self-identifications and other sources, we believe the initial pool of 43 contained eight women and 35 men, and five people of color. The five individuals recommended for consideration to the President included two women and were selected from this group of 10 semi-finalists.

    On March 29, 2018, President Gutmann and Provost Pritchett announced the selection of Dr. Mark Wolff as the Morton Amsterdam Dean of Penn Dental Medicine. Dr. Wolff is a celebrated teacher, globally engaged scholar and deeply experienced clinician who served as professor and chair of cariology and comprehensive care in the College of Dentistry at New York University. He assumed his office on July 1, 2018, after ratification by the Trustees at their June meeting.

    —Antonia M. Villarruel, Professor and Margaret Bond Simon Dean of Nursing; Chair, Consultative Committee on the Selection of a Dean of the School of Dental Medicine

    Nominations for University-Wide Teaching Awards: December 7

    Nominations for Penn’s University-wide teaching awards are now being accepted by the Office of the Provost. Any member of the University community—past or present—may nominate a teacher for these awards. There are three awards:

    The Lindback Award for Distinguished Teaching honors eight members of the standing faculty—four in the non-health schools (Annenberg, Design, SEAS, GSE, Law, SAS, SP2, Wharton) and four in the health schools (Dental Medicine, PSOM, Nursing, Veterinary Medicine).

    The Provost’s Award for Distinguished PhD Teaching and Mentoring honors two faculty members for their teaching and mentoring of PhD students. Standing and associated faculty in any school offering the PhD are eligible for the award.

    The Provost’s Award for Teaching Excellence by Non-Standing Faculty honors two members of the associated faculty or academic support staff who teach at Penn, one in the non-health schools and one in the health schools.

    Nomination forms are available at the Teaching Awards website. The deadline for nominations is Friday, December 7, 2018. Full nominations with complete dossiers prepared by the nominees’ department chairs are due Friday, February 1, 2019.

    Note: For the Lindback and Non-Standing Faculty awards, the health schools—Dental Medicine, Nursing, PSOM and Veterinary Medicine—have a separate nomination and selection process. Contact the relevant Dean’s Office to nominate a faculty member from one of those schools.

    There will be a reception honoring all the award winners in the spring. For information, please email or call (215) 898-7225.

    Criteria and Guidelines

    1. The Lindback and Provost’s Awards are given in recognition of distinguished teaching. “Distinguished teaching” is teaching that is intellectually demanding, unusually coherent and permanent in its effect. The distinguished teacher has the capability of changing the way in which students view the subject they are studying. The distinguished teacher provides the basis for students to look with critical and informed perception at the fundamentals of a discipline, and s/he relates that discipline to other disciplines and to the worldview of the student. The distinguished teacher is accessible to students and open to new ideas, but also expresses his/her own views with articulate and informed understanding of an academic field. The distinguished teacher is fair, free from prejudice and single-minded in the pursuit of truth.

    2. Skillful direction of dissertation students, effective supervision of student researchers, ability to organize a large course of many sections, skill in leading seminars, special talent with large classes, ability to handle discussions or structure lectures—these are all attributes of distinguished teaching, although it is unlikely that anyone will excel in all of them. At the same time, distinguished teaching means different things in different fields. While the distinguished teacher should be versatile, as much at home in large groups as in small, in beginning classes as in advanced, s/he may have skills of special importance in his/her area of specialization. The primary criteria for the Provost’s Award for Distinguished PhD Teaching and Mentoring are a record of successful doctoral student mentoring and placement, success in collaborating on doctoral committees and graduate groups and distinguished research.

    3. Since distinguished teaching is recognized and recorded in different ways, evaluation must also take several forms. It is not enough to look solely at letters of recommendation from students or to consider “objective” evaluations of particular classes in tabulated form. A faculty member’s influence extends beyond the classroom and individual classes. Nor is it enough to look only at a candidate’s most recent semester or opinions expressed immediately after a course is over; the influence of the best teachers lasts, while that of others may be great at first but lessen over time. It is not enough merely to gauge student adulation, for its basis is superficial; but neither should such feelings be discounted as unworthy of investigation. Rather, all of these factors and more should enter into the identification and assessment of distinguished teaching.

    4. The Lindback and Provost’s Awards have a symbolic importance that transcends the recognition of individual merit. They should be used to advance effective teaching by serving as reminders to the University community of the expectations for the quality of its mission.

    5. Distinguished teaching occurs in all parts of the University. Therefore, faculty members from all schools are eligible for consideration. An excellent teacher who does not receive an award in a given year may be re-nominated in some future year and receive the award then.

    6. The Lindback and Provost’s Awards may recognize faculty members with many years of distinguished service or many years of service remaining. The teaching activities for which the awards are granted must be components of the degree programs of the University of Pennsylvania.

  • News
    • NIH Director’s Awards for Seven Penn Faculty
    • SEAS Team: Naval Research Grant
    • Report of the Ad Hoc Consultative Committee for the Selection of a Dean of the School of Dental Medicine
    • Nominations for University-Wide Teaching Awards: December 7
  • Deaths
    • Bernard Carroll, Psychiatry
    • Jay Kislak, Kislak Center
  • Governance
    • From the Senate Office: Faculty Senate Executive Committee Actions
  • Honors
    • Amber Alhadeff: L’Oreal Women in Science Fellowship
    • Liang Feng: Optical Society Fellow
    • Three PSOM Faculty: Career Award for Medical Scientists
    • Deep Jariwala: Young Investigator Award
    • Vincent Reina: National Public Policy Fellowship
    • Kimberly Trout: American College of Nurse-Midwives Fellow
    • Chioma Woko: Health Policy Research Scholar
    • Two SEAS Teams: NSF RAISE EQuIP Grants
    • Teams from School of Nursing, SAS: Green Purchasing Awards
  • Research
    • Prenatal Gene Editing for Treating Congenital Disease
    • Regrowing Dental Tissue with Baby Teeth Stem Cells
    • Reducing Political Polarization on Climate Change
    • New Insights on Interprofessional Health-Care Training
  • Events
    • Diversity Lecture: Sexual Assault in America
    • Live Music at the Annenberg Center
    • BioArt and Bacteria at the Esther Klein Gallery
    • Update: October AT PENN
  • Crimes
    • Weekly Crime Reports
  • Bulletins
    • A Drug-Free Workplace
    • Penn’s Way 2019 Week One Winners and Week Three Prizes


    Bernard Carroll, Psychiatry

    Bernard J. Carroll, former professor of psychiatry in Penn’s Perelman School of Medicine, died September 10 at his home in Carmel, California, from lung cancer. He was 77.

    Dr. Carroll was born in Australia and graduated from the University of Melbourne in 1964 with degrees in psychiatry and medicine. When he was 28 years old, he developed a test called the dexamethasone suppression test, or DST, grounded in biology rather than Freudian theory. However, the test never saw widespread use, because around that same time the classification of types of depression changed and modern antidepressants hit the market, changing how studies were interpreted and shared and what new knowledge was pursued.

    A few years later, in 1971, he came to Penn as a clinical research fellow in the department of psychiatry, and he served as an assistant professor of psychiatry from 1972 to 1973. He went on to positions at the University of Michigan and at Duke, where he earned emeritus status. He later served as clinical director of a geriatric hospital outside Durham, North Carolina.

    Dr. Carroll is survived by his wife, Sylvia.

    Jay Kislak, Kislak Center

    Jay I. Kislak (W’43), real estate magnate and long-time supporter of the University of Pennsylvania, died October 3 at his home in Miami, Florida. He was 96.

    Passionate about rare books, manuscripts and historical artifacts, Mr. Kislak donated $5.5 million to Penn (Almanac September 17, 2013), a gift that was key to renovating Van Pelt-Dietrich Library’s 5th and 6th floors and that created the sleek, modern Kislak Center for Special Collections, Rare Books and Manuscripts, which debuted in 2012 (Almanac April 16, 2013). To date, it remains the largest cash contribution from an individual donor in the Libraries’ history.

    Mr. Kislak, a native of Hoboken, New Jersey, got his first real estate license in high school. After earning an economics degree from Wharton, he served as a US Navy pilot in World War II. In the 1950s, he moved to Florida and expanded his family’s business into a privately held real estate and financial services empire.

    Mr. Kislak’s passion for rare books, manuscripts and historical artifacts began early. Starting first with books, he began to focus his collecting interests on Florida and the Americas, later turning to art and artifacts. Collaborating with his wife, Jean, he assembled widely diverse collections encompassing many interest areas. In 2004, more than 3,000 books and other objects from their collection became a gift to the nation, now known as the Jay I. Kislak Collection at the Library of Congress in Washington, DC. He also made notable donations to create Kislak Centers at the University of Miami and Miami Dade College’s Freedom Tower.

    He is survived by his wife, Jean; children, Jonathan, Philip (C’70) and Paula; stepdaughter Jennifer Rettig; grandchildren, Rebecca, Jason, Tamara, Libby (W’10) and Jane; great-grandchildren Ezra, Simon, Kayla, Julia, Stokes and Aura; and his brother, David.


    From the Senate Office: Faculty Senate Executive Committee Actions

    The following is published in accordance with the Faculty Senate Rules. Among other purposes, the publication of SEC actions is intended to stimulate discussion among the constituencies and their representatives. Please communicate your comments to Patrick Walsh, executive assistant to the Senate Office, either by telephone at (215) 898-6943 or by email at

    Faculty Senate Executive Committee Actions
    Wednesday, October 10, 2018

    2018 Senate Nominating Committee. Pursuant to the Faculty Senate Rules, the members of SEC were requested to submit the name of a member of the Standing Faculty to appear on the Nominating Committee ballot.

    Update from the Office of the Provost. Provost Wendell Pritchett offered an update on a number of topics. The Provost’s Office and the Online Learning Initiative are co-hosting a summit on campus October 12, 2018, with the University of the Future Network to discuss how globalization, online learning and other changes are transforming the university of the future. The Take Your Professor to Lunch program continues during 2018-2019, and a notice will be sent to students in the coming weeks. Benoit Dubé is focusing his initial efforts as Chief Wellness Officer on student wellness initiatives; Provost Pritchett thanked the Faculty Senate for its role in recommending the establishment of the Chief Wellness Officer position. The Penn First Plus program has been created to support first-generation and high-need students; two faculty co-directors, Camille Charles and Robert Ghrist, have been appointed to lead the effort. Several faculty development initiatives are in place to further the development of faculty at all levels, including the Penn Faculty Fellows program and other efforts to support both junior faculty who are working toward promotion and tenure and faculty who are in management or leadership roles. The Office of the Vice Provost for Faculty helped host the conference “Changing the National Conversation: Inclusion and Equity” held in September, which included participation by presidents and provosts from more than 100 universities; a report will be issued later this year. A conversation between the Provost and SEC members ensued.

    Human Capital Management Project Update. Vice Provost for Faculty Anita Allen and Associate Provost for Finance and Planning Mark Dingfield described progress on replacing Penn’s existing payroll and faculty management systems with the cloud-based products Workday and Interfolio, respectively. The cloud-based systems will replace current mainframe systems and will be more secure. Almost every constituency in the University will be affected, and the new products will reduce inefficiencies and streamline processes for faculty hiring, recruitment, promotion, tenure, sabbatical-tracking and more. The systems will launch on July 1, 2019, and a period of disruption during summer 2019 is expected as users acclimatize to them. It is anticipated that both products will be adopted widely across the University. Hands-on training will begin in April and will include classroom-based training sessions and online training modules. On-demand training will be available (e.g., for faculty search committees).

    Moderated Discussion. SEC members discussed which specific topics to address in an in-depth manner during the year. Several topics were identified, of which two will be selected at the next meeting.


    Amber Alhadeff: L’Oreal Women in Science Fellowship

    Amber Alhadeff, a postdoc researcher in the department of biology at Penn, is one of the five recipients of the L’Oréal USA 2018 For Women in Science Fellowship. The fellowships are awarded annually to female postdoctoral scientists. The $60,000 grant is given to advance her research.

    Dr. Alhadeff’s research focuses on understanding the neural circuits and molecular mechanisms that control food intake. This research will give scientists valuable insight into how to treat metabolic diseases such as obesity, eating disorders and type II diabetes. The L’Oréal USA For Women in Science fellowship will provide Dr. Alhadeff funding to further her research, including support to hire two female undergraduate students. During her fellowship, Dr. Alhadeff will also serve as a mentor to local middle and high school girls with a special focus on STEM.

    Liang Feng: Optical Society Fellow

    Liang Feng, assistant professor in the departments of materials science & engineering and electrical & systems engineering in SEAS, has been elected a fellow of the Optical Society.

    Since 1916, the scholarly society has been the “world’s leading champion for optics and photonics, uniting and educating scientists, engineers, educators, technicians and business leaders worldwide to foster and promote technical and professional development.”

    Dr. Feng joined Penn Engineering last year, among the ranks of 30 new faculty hired in a two-year span. That group includes a concentration of experts in data science and new computational techniques, areas that Dr. Feng approaches from his background in developing nanomaterials that provide unprecedented control over light.

    The Optical Society cited Dr. Feng for his “outstanding pioneering scientific contributions to the field of non-Hermitian photonics and its applications in integrated nanophotonics and optoelectronics.”

    Dr. Feng was also recently awarded an NSF grant from its Engineering Quantum Integrated Platforms program; he and fellow MSE professor Ritesh Agarwal will use it to build quantum communication devices that take advantage of chiral properties of individual photons (see article).

    Three PSOM Faculty: Career Award for Medical Scientists

    Three Perelman School of Medicine faculty members at the University of Pennsylvania have received 2018 Burroughs Wellcome Fund Career Awards for Medical Scientists: Elizabeth Joyce Bhoj, assistant professor of pediatrics, for research on “a novel pediatric neurodegenerative disorder caused by histone 3.3 mutations: unique insights into the histone code”; Sarah Emily Henrickson, instructor of pediatrics, for “directly interrogating mechanisms of human T cell dysfunction in the setting of chronic inflammation and atopy”; and Mark Sellmyer, assistant professor of radiology, for “engineering digital logic for cell-cell interactions.”

    The Career Awards for Medical Scientists (CAMS) is a highly competitive program that provides $700,000 awards over five years to physician-scientists who are committed to an academic career, to bridge advanced postdoctoral/fellowship training and the early years of faculty service.

    Deep Jariwala: Young Investigator Award

    The journal Nanomaterials has named Deep Jariwala, assistant professor in the department of electrical and systems engineering in Penn’s School of Engineering, the winner of its annual Young Investigator Award, as selected by the journal’s editorial board.

    Dr. Jariwala is an expert in nano- and atomic-scale devices that could have applications in information technology and renewable energy, among other fields. In giving him the award, Nanomaterials noted, “Dr. Jariwala’s impressive work combines novel nanomaterials, such as carbon nanotubes and 2D transition metal dichalcogenides, into heterostructures and electronic and optoelectronic devices. His work encompasses synthesis of nanomaterials, characterization of their electronic and optical properties, and then fabrication of them into devices, such as diodes, FETs and photodetectors.”

    Vincent Reina: National Public Policy Fellowship

    Vincent Reina, assistant professor in the department of city and regional planning at PennDesign, has been awarded a fellowship from the Association for Public Policy Analysis & Management (APPAM). The 40 for 40 Fellowships provide funding for early-career research professionals to attend APPAM’s Fall Research Conference in Washington, DC. APPAM notes that promoting the work of early-career professionals like Dr. Reina is intended to shape the future of public policy research.

This is the second time Dr. Reina has been honored by APPAM. He earned the organization’s prestigious Dissertation Award in 2016. His dissertation, “The Impact of Mobility and Government Subsidies on Household Welfare and Rents,” examines the behavior of landlords who provide affordable housing and the formation of policies to ensure the availability of affordable housing for low-income households. Dr. Reina’s research focuses on urban economics, low-income housing policy, household mobility and the role of housing in community and economic development.

    Kimberly Trout: American College of Nurse-Midwives Fellow

Kimberly Kovach Trout, assistant professor of women’s health in the department of family and community health and the track lead of the nurse-midwifery graduate program in Penn Nursing, has been named a fellow in the American College of Nurse-Midwives (ACNM).

    Fellowships in the ACNM are awarded to midwives who have demonstrated leadership, clinical excellence, outstanding scholarship and professional achievement and who have merited special recognition both within and outside of the midwifery profession. The fellowship’s mission is to serve the ACNM in a consultative and advisory capacity.

    Dr. Trout’s induction ceremony took place this past May during the ACNM 63rd Annual Meeting & Exhibition in Savannah, Georgia.

    Chioma Woko: Health Policy Research Scholar

The Robert Wood Johnson Foundation recently announced that Chioma Woko, a doctoral student in Penn’s Annenberg School for Communication, has been named to its 2018 cohort of 40 Health Policy Research Scholars.

Designed for second-year doctoral students from underrepresented populations and disadvantaged backgrounds, the Health Policy Research Scholars program helps researchers from all fields—from economics to epidemiology—apply their work to policies that advance equity and health while building a diverse field of leaders who reflect our changing national demographics. The four- to five-year program provides participants with an annual stipend of up to $30,000.

    Ms. Woko is a health communication doctoral student studying health behaviors online. She is conducting research on what factors influence people in social networks to carry out health behaviors, such as physical activity, contraceptive use and tobacco-related behaviors.

    Focusing on Black American populations, Ms. Woko’s work is based on evidence that suggests that different demographic groups use online resources for health in different ways, which are inherently related to disparities in health literacy and access to health resources. Ultimately, she hopes that her work will inform policy development that will impact the health outcomes of all marginalized groups.

Ms. Woko previously held a position at RTI International, where she worked on government-funded research projects on food, nutrition and obesity policy. She will be advised through the program by John B. Jemmott III, Kenneth B. Clark Professor of Communication & Psychiatry and director of the Center for Health Behavior and Communication Research.

    Two SEAS Teams: NSF RAISE EQuIP Grants

The National Science Foundation has awarded two of eight Research Advanced by Interdisciplinary Science and Engineering (RAISE) grants to teams from Penn’s School of Engineering and Applied Science for their proposed quantum information science research. Each team will receive $750,000 over the next three years.

    The RAISE Engineering Quantum Integrated Platforms (EQuIP) grants are designed to propel advances in quantum information science, which aims to harness the inherent quantum-mechanical properties of light and matter for new technologies. The EQuIP program focuses on quantum communication, which explores how information can be condensed, sent and stored.

A team headed by Lee Bassett, assistant professor in the department of electrical and systems engineering, will explore how individual impurity atoms in a diamond can be used as a platform for quantum communication. The goal is to develop compact, chip-scale devices that operate as small quantum computers coupled to single photons in optical fibers, which can serve as the backbone of a future quantum internet. The team includes members of the Electronic Photonic Microsystems Lab led by Firooz Aflatouni, the Skirkanich Assistant Professor of Electrical and Systems Engineering, as well as a group at Brown University. The team will also collaborate with Tim Taminiau and his group at QuTech at Delft University of Technology in The Netherlands.

The second Penn team is led by Ritesh Agarwal, professor of materials science and engineering, and Liang Feng, assistant professor of materials science and engineering and of electrical and systems engineering. Together with Stefan Strauf, professor of physics at the Stevens Institute of Technology and an expert in quantum signal generation, they will advance quantum communication by using advanced nanophotonic technology, delivering revolutionary quantum circuits that generate and process quantum signals via a single photon. Dr. Agarwal said, “We’re not just reducing the size—we’re reducing the cost. That’s our dream, to make this technology accessible to everyone.”

    Both teams plan to integrate undergraduate and graduate students into the research and will participate in educational outreach programs to facilitate interest in quantum information science in students from preschool through 12th grade.

    Teams from School of Nursing, SAS: Green Purchasing Awards

Penn’s Green Purchasing Awards, presented by Penn Purchasing Services and Penn Sustainability, were announced at the annual Purchasing Services Supplier Show on September 25.

    The award program recognizes the outstanding contributions of an individual or team that significantly advance the development of sustainable purchasing practices at Penn.

    “With Penn’s dedication to environmental sustainability, it’s important to acknowledge the outstanding contributions being made at the University. We work in a decentralized purchasing environment with daily buying decisions that are made at the department level,” said Mark Mills, executive director of Penn Purchasing Services. “Given this model, it’s important to recognize our colleagues in the Schools and Centers who have embraced sustainability in their purchasing choices. Our 2018 honorees are making smart, responsible purchasing decisions and instituting new programs—many of which can be shared and repeated across the University.”

The first 2018 award was given to the School of Nursing’s One Less campaign team. A series of green gifts was chosen for faculty and staff and distributed at the school’s annual Service and Recognition Awards event. This all-volunteer team worked on- and off-duty to design the logo and reach consensus on the choice of this year’s reusable items. Among them were small tote bags, which can remove disposable plastic bags from the waste stream, and reusable travel mugs, which can eliminate 23 pounds of waste annually per person using them. The team negotiated with the school’s café operator to provide an ongoing discount to anyone who uses those travel mugs (or any reusable cup), incentivizing members of the community to make a green purchase daily. The award recipients included Patricia Adams, Lucia DiNapoli, Olivia Duca, Joseph Gomez, Karen Keith-Ford, Theresa Lake, Holly Marrone, Seymour Sejour and Meredith Swinney.

    The second award recipient was the Furniture Reuse and Recycling team from SAS. The team created a system that strives to divert used furniture from landfills. The process begins by creating a monthly inventory of all used furniture available in the School. The inventory is then circulated for review to SAS’s building administrators and departments. Then the list is sent to the Netter Center for Community Partnerships—reaching dozens of charity partners who may be able to reuse furniture listed on the inventory. Furniture that cannot be reused within SAS or by charity partners is recycled by Revolution Recovery. Revolution Recovery is able to divert over 80% of SAS furniture that has reached the end of its useful life from landfills. In the last four quarters for which program metrics are available (FY17 Q4-FY18 Q3), SAS has diverted 20.86 tons of furniture from landfills. That is an 88.8% diversion rate overall. The honorees from the SAS team are Jonathan Burke, Carvel Camp, Floyd Emelife, Brittany Gross, Ruth Kelley, Ryshee McCoy and Isabel Sampson-Mapp.

    Both initiatives align with Penn’s Climate Action Plan 2.0, the University’s comprehensive strategic roadmap for environmental sustainability. For more information about the recipients, visit


    Prenatal Gene Editing for Treating Congenital Disease

For the first time, scientists performed prenatal gene editing to prevent a lethal metabolic disorder in laboratory animals, offering the potential to treat human congenital diseases before birth. Published in Nature Medicine, research from the Perelman School of Medicine at the University of Pennsylvania and the Children’s Hospital of Philadelphia (CHOP) offers proof-of-concept for prenatal use of a sophisticated, low-toxicity tool that efficiently edits DNA building blocks in disease-causing genes.

    The team reduced cholesterol levels in healthy mice treated in utero by targeting a gene that regulates those levels. They also used prenatal gene editing to improve liver function and prevent neonatal death in a subgroup of mice that had been engineered with a mutation causing the lethal liver disease hereditary tyrosinemia type 1 (HT1).

    HT1 in humans usually appears during infancy, and it is often treatable with a medicine called nitisinone and a strict diet. However, when treatments fail, patients are at risk of liver failure or liver cancer. Prenatal treatment could open a door to disease prevention for HT1 and potentially for other congenital disorders.

    “Our ultimate goal is to translate the approach used in these proof-of-concept studies to treat severe diseases diagnosed early in pregnancy,” said study co-leader William H. Peranteau, a pediatric and fetal surgeon in CHOP’s Center for Fetal Diagnosis and Treatment and assistant professor of surgery in the Perelman School of Medicine. “We hope to broaden this strategy to intervene prenatally in congenital diseases that currently have no effective treatment for most patients and result in death or severe complications in infants.”

In this study, the scientists used base editor 3 (BE3), which takes a partially active version of the CRISPR-Cas9 tool and harnesses it as a homing device to carry an enzyme to a highly specific genetic location in the liver cells of fetal mice. There, the enzyme chemically modified the targeted genetic sequence, changing one type of DNA base to another. Because BE3 does not fully cut the DNA molecule, it avoids leaving the DNA vulnerable to the unanticipated errors that can arise when a cut is repaired, as has been seen with the standard CRISPR-Cas9 tool.

    After birth, the mice in the study carried stable amounts of edited liver cells for up to three months after the treatment, with no evidence of unwanted, off-target editing at other DNA sites. In the subgroup of the mice bioengineered to model HT1, BE3 improved liver function and preserved survival. The BE3-treated mice were also healthier than mice receiving nitisinone, the current first-line treatment for HT1 patients. To deliver CRISPR-Cas9 and BE3, the scientists used adenovirus vectors, but they are investigating alternate delivery methods such as lipid nanoparticles, which are less likely to stimulate unwanted immune responses.

    Regrowing Dental Tissue with Baby Teeth Stem Cells

When trauma affects an immature permanent tooth, it can hinder blood supply and root development, resulting in what is essentially a “dead” tooth. Until now, the standard of care has entailed a procedure called apexification that encourages further root development, but it does not replace the lost tissue from the injury and causes root development to proceed abnormally.

New results from a clinical trial, jointly led by Songtao Shi of the University of Pennsylvania and Yan Jin, Kun Xuan and Bei Li of the Fourth Military Medical University in Xi’an, China, suggest that there is a more promising path: using stem cells extracted from the patient’s baby teeth. Dr. Shi and colleagues have learned more about how these dental stem cells, called human deciduous pulp stem cells (hDPSC), work and how they could be safely employed to regrow dental tissue, known as pulp.

    The Phase 1 trial, conducted in China, enrolled 40 children who had each injured one of their permanent incisors and still had baby teeth. Thirty were assigned to hDPSC treatment and 10 to the control treatment, apexification. Those who received hDPSC treatment had tissue extracted from a healthy baby tooth. The stem cells from this pulp were allowed to reproduce in a laboratory culture, and the resulting cells were implanted into the injured tooth. Upon follow-up, the researchers found that patients who received hDPSCs had more signs than the control group of healthy root development and thicker dentin, the hard part of a tooth beneath the enamel, as well as increased blood flow. At the time the patients were initially seen, all had little sensation in the tissue of their injured teeth. A year following the procedure, only those who received hDPSCs had regained some sensation.

    While using a patient’s own stem cells reduces the chances of immune rejection, it is not possible in adult patients who have lost all of their baby teeth. Dr. Shi and colleagues are beginning to test the use of allogenic stem cells, or cells donated from another person, to regenerate dental tissue in adults. They are also hoping to secure FDA approval to conduct clinical trials using hDPSCs in the United States. Eventually, they see even broader applications of hDPSCs for treating systemic disease, such as lupus.

    Reducing Political Polarization on Climate Change

Social media networks may offer a solution to reducing political polarization, according to new findings published in the Proceedings of the National Academy of Sciences from a team led by Damon Centola, associate professor of communication in Penn’s Annenberg School for Communication and the director of the Network Dynamics Group.

    Researchers asked 2,400 Republicans and Democrats to interpret recent climate-change data on Arctic sea-ice levels. Initially, nearly 40 percent of Republicans incorrectly interpreted the data, saying that Arctic sea-ice levels were increasing; 26 percent of Democrats made the same mistake. However, after participants interacted in anonymous social media networks—sharing opinions about the data and its meaning for future levels of Arctic sea ice—88 percent of Republicans and 86 percent of Democrats correctly analyzed it.

    Republicans and Democrats who were not permitted to interact with each other in social media networks but had several additional minutes to reflect on the climate data before updating their responses remained highly polarized and offered significantly less accurate forecasts.

    Dr. Centola, along with Penn doctoral student Douglas Guilbeault and recent Penn PhD graduate Joshua Becker, constructed an experimental social media platform to test how different kinds of social media environments would affect political polarization and group accuracy. The researchers randomly assigned participants to one of three experimental social media groups: a political-identity setup, which revealed the political affiliation of each person’s social media contacts; a political-symbols setup, in which people interacted anonymously through social networks but with party symbols of the donkey and the elephant displayed at the bottom of their screens; and a non-political setup, in which people interacted anonymously. Twenty Republicans and 20 Democrats made up each social network. Once randomized, every individual then viewed a NASA graph with climate change data as well as forecasted Arctic sea-ice levels for the year 2025. They first answered independently, and then viewed peers’ answers before revising their guesses twice more.

    “We were amazed to see how dramatically bipartisan networks could improve participants’ judgments,” said Dr. Centola. In the non-political setup, for example, polarization disappeared entirely, with more than 85 percent of participants agreeing on a future decrease in Arctic sea ice.

    “But,” Dr. Centola added, “…the improvements vanished completely with the mere suggestion of political party.”

    New Insights on Interprofessional Health-Care Training

A recent research study led by Zvi D. Gellis, director of the Center for Mental Health & Aging and the Ann Nolan Reese Penn Aging Certificate Program at Penn’s School of Social Policy & Practice, demonstrates the positive impact of utilizing Interprofessional Education (IPE) simulation-based training to instruct health professions students in team communication.

The federally funded study, led by Dr. Gellis and his health professions colleagues from Penn and the University of the Sciences, reports on outcomes of a simulation-based “real-world” training among a large group of health professions students comprising medicine, nursing, chaplaincy and geriatrics social work scholars (from the Penn Aging Certificate Program), as well as University of the Sciences occupational therapy, physical therapy and pharmacy students.

    Dr. Gellis and his research partners examined a comprehensive set of outcomes overlooked in previous work, including attitudes towards health-care teams, self-efficacy in team communication, interprofessional collaboration and satisfaction with the simulation. The research team chose a geriatrics-palliative case study because this specialty has grown significantly in the US. Interprofessional teams frequently treat older patients with prevalent and complex chronic illnesses. Following the training, team communication self-efficacy scale scores and interprofessional collaboration scores increased among the health professions students. In addition, all participants reported more positive attitudes towards working in health-care teams and reported high satisfaction scores, post-simulation.

    The study, published in the journal Gerontology & Geriatrics Education, revealed many advantages to simulation training in health-care education. Simulation training enables students to practice clinical skills in real time among peers and faculty, without jeopardizing the safety of actual patients, and it affords the opportunity to receive immediate patient feedback within a supportive learning environment. Meanwhile, faculty have the chance to lead by example by discussing the significance of interprofessional team roles, participant recruitment in simulation learning with other disciplines, and modeling positive and professional clinical team behaviors. Simulation training can improve performance and self-efficacy in real-world clinical settings, resulting in a better experience for patients and their caregivers.


    Diversity Lecture: Sexual Assault in America

Susan B. Sorenson, professor of social policy at SP2 and executive director of the Ortner Center on Violence and Abuse in Relationships, will discuss From College Campuses to #MeToo: Sexual Assault in America on Wednesday, October 24 as part of The Diversity Lecture Series at Penn. The noon lecture at the second-floor meeting room of the Penn Bookstore is free and open to the public.

    Dr. Sorenson will discuss how views on sexual assault have changed during the past 50 years with a particular focus on the role of college campuses. The hour will be split between her talk and a conversation about what might be next.

The Diversity Lecture Series is intended to provide insight into and understanding of multicultural issues. As an essential component of education, it is designed to encourage civil debate, broaden the basis for critical thought and promote cultural understanding.

    To register, visit

    Live Music at the Annenberg Center

The Philadelphians: Migrations That Made Our City: Philadelphia has been shaped by a long history of diverse cultures and traditions. In The Philadelphians, the Chamber Orchestra explores the populations that migrated to and influenced the city, uncovering a unique, shared identity. Audiences will experience two periods in time, a contrast of colonial-era early music with new works that look back on Philly’s history. Along with Junto-style discussion groups, period performance and modern interpretations will connect the audience with those who created the cultural landscape.

    Who is Philadelphia? What can we learn from our heritage, and how will our city be changed by new waves of immigrants? Join us as we examine our ancestry through music and discover how we came to be Philadelphians.

The 2018-2019 season’s focus is the African American and English colonial experience; the first performance by The Chamber Orchestra of Philadelphia, Origins & Diaspora, takes place on Wednesday, October 17 at 7:30 p.m. The program will include West African musical traditions and their influences in classical music.

This will be a unique, interactive chamber music experience with members of The Chamber Orchestra of Philadelphia performing in the round. Host Jim Cotter will provide background and insight on each work and lead conversations with the musicians between pieces. The performance concludes with a casual audience Q&A. Tickets:

The Portland Cello Project makes its Annenberg Center debut at 8 p.m. on Saturday, October 20, performing Radiohead’s OK Computer and more. Cellos and Radiohead were meant to collide, and the results are seriously epic. Portland’s premier alt-classical group, complete with brass, percussion and vocals, pays tribute to Radiohead with a unique spin on music from the band’s OK Computer album and other favorites. “Every piece is treated with equal sincerity and arranged not just to invoke the original but deconstruct and re-imagine its essence.” (Seattle Times) Expect an evening “where boundaries are blurred and cellos are in abundance.” (The Strad)

    Soul Songs: Inspiring Women of Klezmer will have a world premiere on Sunday, October 28 at 4 p.m.—a one-night-only special event—where 12 women will be breathing contemporary life into the centuries-old tradition of Eastern European Jewish folk music at Annenberg Center’s Zellerbach Theatre. The brainchild of fourth generation klezmer musician and concert artistic director Susan Watts, this performance was created from the world-renowned trumpeter’s concern for the future of her art and appreciation of every individual involved.

“Soul Songs is about the old and new intertwined,” said Ms. Watts, a 2015 Pew Fellow. “It is future provoking, intuitive, grass roots. Soul Songs is about these women’s musical journeys, their artistry and their discernment to use the force of adversity to their gain. It is the klezmer of today and a prelude to future possibilities for the art and the communities it nurtures.” Soul Songs will feature new compositions, written and performed by three generations of women who bring contemporary meaning to this traditional music. Major support has been provided to the Philadelphia Folklore Project by The Pew Center for Arts & Heritage.


    BioArt and Bacteria at the Esther Klein Gallery

A solo exhibition by internationally acclaimed British artist Anna Dumitriu will open at the Esther Klein Gallery on Thursday, October 18. BioArt and Bacteria explores our relationship with the microbial world and the history and future of infectious diseases. An artist lecture will be held on Thursday, October 18 at 5 p.m., immediately followed by the exhibit’s opening reception 6-8 p.m. at the gallery.

    To register, visit

    The exhibit runs through November 24.

    Update: October AT PENN

Talks

    19 The History, Theory and Practice of Administrative Constitutionalism; 2018 University of Pennsylvania Law Review Symposium; 1 p.m.; Penn Law; info and to register: October 20.

25 From Inquiry to Innovation: How a Clinical Question Became a Business Opportunity; Kathryn Bowles, nursing; 3 p.m.; Fagin Hall; RSVP:

Public Health vs. the Viruses: A Matchup for the Century; CPHI Seminar Series; Anne Schuchat, CDC; 3 p.m.; Rubenstein Auditorium, Smilow Center (Center for Public Health Initiatives, Penn Dental, Prevention Research Center, Student Health Service).

    AT PENN Deadlines

    The October AT PENN is online. The November AT PENN will be published October 30. The deadline for the weekly Update is the Monday of the week prior to the issue. The deadline for the December AT PENN is November 5.


    Weekly Crime Reports

Below are the Crimes Against Persons, Crimes Against Society and Crimes Against Property from the campus report for October 1-7, 2018. View prior weeks’ reports. —Ed.

This summary is prepared by the Division of Public Safety and includes all criminal incidents reported and made known to the University Police Department for the dates of October 1-7, 2018. The University Police actively patrol from Market Street to Baltimore Avenue and from the Schuylkill River to 43rd Street in conjunction with the Philadelphia Police. In an effort to provide you with a thorough and accurate report on public safety concerns, we hope that your increased awareness will lessen the opportunity for crime. For any concerns or suggestions regarding this report, please call the Division of Public Safety at (215) 898-4482.

    10/2/18 4:02 PM 434 S 42nd St Secured bike taken

    10/2/18 5:00 PM 3900 Walnut St Confidential

    10/2/18 7:39 PM 4000 Locust Walk Complainant assaulted by offender

    10/3/18 1:17 PM 3400 Spruce St Money not deposited in bank

    10/3/18 1:23 PM 3409 Walnut St Unattended backpack and contents taken

    10/3/18 7:53 PM 4100 Walnut St Bike taken/Arrest

    10/4/18 8:22 AM 4207 Baltimore Ave Offender smashed front door window and stole tools

    10/4/18 10:20 AM 3400 Spruce St Patient’s unsecured phone stolen

    10/4/18 1:10 PM 3335 Woodland Walk Keys and cell phone not returned to owner

    10/5/18 2:30 AM 3401 Spruce St Unknown male touched complainant inappropriately

10/5/18 12:55 PM 3800 Walnut St Secured bike taken

    10/6/18 2:16 AM 3549 Chestnut St Altercation between boyfriend and girlfriend

    10/6/18 2:41 PM 3603 Walnut St Merchandise taken without rendering payment

    10/6/18 7:31 PM 4039 Chestnut St Items removed from packages

    10/6/18 7:38 PM 3631 Walnut St Phone taken from display

    10/7/18 9:17 PM 3000 Chestnut St Complainant assaulted by partner/Arrest

    18th District

    Below are the Crimes Against Persons from the 18th District: 9 incidents (1 robbery, 1 assault, 1 indecent assault, 2 aggravated assaults and 4 domestic assaults) were reported October 1-7, 2018 by the 18th District covering the Schuylkill River to 49th Street & Market Street to Woodland Avenue.

    10/1/18 8:51 PM 4806 Market St Robbery

    10/2/18 5:00 PM 3900 Walnut St Indecent Assault

    10/2/18 7:59 PM 40th & Locust Sts Assault

    10/3/18 6:21 PM 4901 Chestnut St Aggravated Assault

    10/3/18 9:44 PM 47th & Springfield Ave Domestic Assault

    10/5/18 6:23 PM 4500 Baltimore Ave Domestic Assault

    10/6/18 2:52 AM 3549 Chestnut St Domestic Assault

    10/6/18 11:11 AM 48th & Spruce Sts Aggravated Assault

    10/7/18 9:18 PM 30th & Chestnut Sts Domestic Assault


    A Drug-Free Workplace

The University of Pennsylvania is committed to maintaining a drug-free workplace for the health and safety of the entire Penn community. Drug and alcohol abuse can harm not only the users but also their family, friends and coworkers. As Penn observes National Drug-Free Work Week, please take the time to review the University’s drug and alcohol policies.

    Penn’s Drug and Alcohol Policies

    Penn prohibits the unlawful manufacture, distribution, dispensation, sale, possession or use of any drug by its employees in its workplace. Complete policy details are available online:

    Drug-Free Workplace Policy:

    The University Alcohol and Drug Policy:

    Understanding Addiction

    Addiction is a serious disease, but many effective treatments are available. Visit the Health Advocate at for facts about addiction, recovery and support services.

    Help Is Here

    If you or a family member has a substance abuse problem, we encourage you to seek help. Penn provides free, confidential counseling services for you and your immediate family members through the Employee Assistance Program (EAP). The EAP will assist you with challenges that may interfere with your personal or professional life, including substance abuse.

For more information about the EAP’s counseling and referral services, visit the Employee Assistance Program web page at or contact the Employee Assistance Program 24 hours a day, 7 days a week at (866) 799-2329.

    You can also refer to Penn’s addiction treatment publication for information about treatment benefits and resources at

    Penn’s Way 2019 Week One Winners and Week Three Prizes

Penn’s Way 2019 Raffle Prize Listing: Week One Winners

    Office Depot: Supply Basket ($100); Kara Eller, HUP

    Philip Rosenau Co., Inc.: Walmart gift card ($50); Orjana Kurti, CPUP

    Fisher Scientific: Home Depot gift card ($50); Susan Sorenson, SP2

    Fisher Scientific: Lowe’s gift card ($50); Geoffrey Filinuk, ISC

    Specialty Underwriters LLC: Amazon gift card ($100); Shynita Price, UPHS Corporate

    Philadelphia Eagles: Carson Wentz autographed 8×10 photo ($50); Joanne DeLuca, CPUP

    Week Three Drawing: October 22, 2018

    Visit for more information about the raffle and making a pledge. Entries must be received by 5 p.m. on the prior Friday for inclusion in a given week’s drawing. Note: List is subject to change.

    Sponsor: prize (value)

    Philip Rosenau Co., Inc.: Walmart gift card ($50)

    Fisher Scientific: ExxonMobil gift card ($50)

    Fisher Scientific: Old Navy gift card ($50)

    Philadelphia Eagles: Chris Long autographed Super Bowl LII mini helmet ($30)

    Starr Restaurants: Parliament Coffee Bar gift bag ($75)

    Gift Baskets for Thought: Penn-Themed gift basket ($75)

    Philadelphia Flyers: Signed memorabilia ($35)


    Almanac is the official weekly journal of record, opinion and news for the University of Pennsylvania community.


    © 1954-2018 The University of Pennsylvania Almanac.

    3910 Chestnut St., 2nd Floor, Philadelphia, PA 19104-3111

Phone: (215) 898-5274

    How Mindtree CIO is using collaboration technologies to boost employee productivity –

    Mindtree has a challenge that’s common to many organizations: its assets and resources are spread across multiple geographies, and its 17,000-strong workforce works on projects for various customers. Collaboration, for Mindtree, is not a choice but a business imperative. And that means more than connecting systems; it means connecting people, processes, and systems through digital technologies.

    Not an easy task, but CIO Subramanyam Putrevu is pursuing the target with an unflinching spirit. He is using IT to foster collaboration, shore up employee productivity and effectiveness, improve decision making, and help business leaders work more efficiently, leaving more time and money for strategic thinking.

    In a wide-ranging interview with ETCIO.COM, Subramanyam Putrevu, CIO of Mindtree, talks about the ways in which he is using analytics, IoT, and AI to boost employee productivity.

    Mindtree is an IT services company employing 17,000 employees spread across various geographies. How do you leverage technology to foster collaboration among such a huge workforce spread across diverse locations?

    15 top science & tech leaders offer surprising predictions for 2018

    The past year has been a momentous one for science and technology. From the detection of gravitational waves (predicted almost a century ago by Einstein) to the rise of virtual currencies like Bitcoin to the creation of genetically modified human embryos, 2017 was marked by all sorts of remarkable discoveries and innovations.

    What will 2018 bring? No one knows for sure. But as we did for 2017, we asked top scientists and thought leaders in innovation what they expect to see in the new year. Here, lightly edited, are their predictions.

    Sean Carroll: Understanding quantum spacetime

    Dr. Sean Carroll is a theoretical physicist at the California Institute of Technology in Pasadena. His most recent book is “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself.”

    I’m going to go out on a limb and predict that we’ll see dramatic advances in understanding the quantum nature of spacetime itself. I won’t make any large bets on this possibility, since theoretical research is notoriously gradual and unpredictable. But the ingredients are in place for moving our understanding substantially forward.

    Quantum mechanics is the wonderfully successful theory of how the world behaves at the microscopic level, while on large scales space and time are wedded together in Einstein’s famous general theory of relativity. Reconciling how both of these ideas can be true at the same time has been a longstanding puzzle for theoretical physicists. Recently, we have been bringing new tools to bear: information theory, the many-worlds interpretation of quantum mechanics, and an improved understanding of black-hole entropy. The time is right to finally figure out the quantum ingredients out of which space and time are made.

    Leroy Chiao: Cryptocurrency takeover

    Dr. Leroy Chiao is CEO and co-founder of OneOrbit LLC, a Houston-based training and education company. He served as a NASA astronaut from 1990 to 2005 and flew four missions aboard three space shuttles and once co-piloted a Russian Soyuz spacecraft to the International Space Station.

    As an astronaut, I am always following developments in space exploration programs, both government and commercial. However, while not directly linked to space, my tech prediction for 2018 is about Bitcoin (BTC) and other cryptocurrencies. I believe that 2018 will see mainstream adoption of BTC in a significant part of the worldwide financial industry. In the coming years, the current 1,300 or so cryptocurrencies will battle it out, with just a few left standing. In the future, BTC or its successor would likely be the currency on the moon and Mars!

    George Church: Big leaps in synthetic biology

    Dr. George Church is a professor of genetics at Harvard Medical School in Boston. He is the author of “Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves.”

    The year 2018 will finally see the public embrace million-fold cheaper personal genomes, thanks to better education and awesome software. Leveraging such revolutionary diagnostic costs, therapeutics costs may follow — via radically engineered nutritional supplements, veterinary products, yogurts, citizen science, and preventing wild animals from carrying malaria or Lyme disease.

    We’ll see machine learning applied to drug delivery and to preventive medicine, thanks to shareable, rich, individual-patient-level precision medicine data sources. 2018 will bring initiatives on multiple projects to make synthetic cells safe from all viruses — achieved via methods far more precise and efficient than current gene-editing methods. Microscopes will blossom with images of chromosomes at super-resolution, as well as wide fields of millions of cells retaining intricate connections among nerve cells. We’ll see new data on embryos growing outside of a mouse body, and human gene collections that enable forming in the lab any of our body’s organ systems. Among other things, this will enable new synthetic nerve cells for brain-computer interfaces, gentle alternatives to invasive electrodes.

    Esther Dyson: Progress in health

    Esther Dyson, a veteran tech and healthcare angel investor, is executive founder of Way to Wellville, a 10-year project to demonstrate the value of investing in health vs. spending on healthcare. It operates in five small communities around the U.S. and is working with local organizations to enhance their capacity to train and deploy local people in caregiving and health-fostering programs.

    In 2018, even as the country’s healthcare system is undergoing great turmoil, we may start looking more closely and use big data to understand what’s really going on. We will learn how to reduce costs — not just the costs of healthcare and drugs, but also of unemployment/low productivity and absenteeism, along with the social costs of poor health, addiction, depression, crime and drug overdoses.

    Traditionally, we’ve used clinical trials in healthcare, but they really don’t work well with population health and social changes, with too many variables to control. Now, with big data, and more data available through everything from health records and fitness apps to public data such as high school graduation rates and population demographics, we are increasingly able to compare what happens with what would have happened without a particular intervention. These interventions include prenatal care, with measurable improvements in birth outcomes and reductions in NICU (neonatal intensive care unit) costs, diabetes-prevention programs now being offered by the YMCA and many other organizations, and mental health/addiction counseling programs (which remain in extremely short supply).

    Progress in healthcare is notoriously slow, so actual practice won’t change that rapidly. But with luck, some communities will lead by example, and policy-makers will take note.

    Oren Etzioni: Artificial intelligence crosses over

    Dr. Oren Etzioni is CEO of the Allen Institute for Artificial Intelligence and a professor of computer science at the University of Washington, both in Seattle.

    In 2018, artificial intelligence will cross over from razor-thin, narrow AI — the kind of bespoke AI that beats people at Go, poker, and other narrowly delimited tasks but must be reconfigured manually for each new challenge — to broader, multipurpose AI systems that can tackle several challenges using the same software.

    For example, we will see a single AI that, once trained, can play multiple very different games, answer questions on topics ranging from politics to science to cooking to everyday life, and more. General AI is still decades away, but razor-thin AI is so very 2017.

    Jacqueline Faherty: The year of the Milky Way

    Dr. Jacqueline Faherty is an astrophysicist at the American Museum of Natural History in New York City. She is co-founder of the citizen science project Backyard Worlds, which invites anyone to help uncover undiscovered worlds in the galaxy.

    2018 will be the year of the Milky Way Galaxy. In April, the European Space Agency’s Gaia Mission, among the most ambitious in modern times, will release its second catalog. It will include distances to over a billion stars and velocities for several million. Scientists have waited decades for this 10,000-fold increase in the number of stars calculated with unprecedented positional accuracy.

    Based on this new data, we will be able to generate an exquisitely detailed 3D map of our home galaxy. We will uncover previously hidden structures of stars and traces of recent and long-past star formation. Exotic objects like hypervelocity stars will be revealed; we will be able to trace back and project forward the positions of stars in the nearby solar neighborhood and identify past or future stellar encounters. We will see immediate results after April and get revolutionary insights into how our galaxy formed and evolved.

    Katherine Freese: Cosmic breakthroughs

    Dr. Katherine Freese is a professor of physics at the University of Michigan in Ann Arbor and a noted expert on dark matter. She is the author of “The Cosmic Cocktail: Three Parts Dark Matter.”

    Last October, an amazing neutron star merger event was discovered via gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and, 1.7 seconds later, by some 70 different detectors across all wavelengths of light. That means gravity travels at a speed very close to that of light. Combining all this information has already ruled out many models of gravity beyond Einstein’s relativity. As more events are discovered, we will learn more about relativity, and about the numbers of neutron stars and black holes of different masses. The black holes already discovered, up to 30 times the mass of the sun, are a surprise, and we will learn what other masses are out there.
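
    The tightness of that statement comes from simple arithmetic. The merger’s host galaxy lies roughly 130 million light-years away, i.e. about 4 × 10^15 seconds of light travel time, so a 1.7-second arrival gap limits any fractional speed difference. The following is a back-of-the-envelope sketch assuming near-simultaneous emission; the published analysis, which allows for emission-time uncertainty, quotes a bound of the same order:

```latex
\frac{\lvert v_{\mathrm{gw}} - c\rvert}{c}
  \;\lesssim\; \frac{\Delta t}{t_{\mathrm{travel}}}
  \;\approx\; \frac{1.7\,\mathrm{s}}{4\times 10^{15}\,\mathrm{s}}
  \;\approx\; 4\times 10^{-16}
```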

    Events in 2018 will also teach us about what astronomers call H0, the expansion rate of the universe. Right now there is an interesting discrepancy between the value of H0 measured by cosmic microwave background experiments (the early left-over light from the Big Bang) and the value measured from more recent supernovas (exploding stars). The combination of more black hole and neutron star events measured by both LIGO and electromagnetic detectors stands to resolve this issue. Is the discrepancy real? If so, what is the new physics it heralds?
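
    For scale, the tension is at the few-percent level; the approximate values being debated around that time (quoted here as an illustrative sketch, not taken from the article) were:

```latex
H_0^{\text{CMB}} \approx 67\text{--}68~\mathrm{km\,s^{-1}\,Mpc^{-1}},
\qquad
H_0^{\text{SNe}} \approx 73~\mathrm{km\,s^{-1}\,Mpc^{-1}}
```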

    Lawrence Krauss: Breaking the standard model?

    Dr. Lawrence Krauss is a professor of earth and space exploration and director of the Origins Project at Arizona State University in Tempe. He is the author of nine books, including “A Universe from Nothing,” “The Physics of Star Trek,” and “The Greatest Story Ever Told.”

    Either the Large Hadron Collider or LIGO will observe events which are inconsistent with our current understanding of the standard model of physics and/or black hole physics. This is more a hope than a prediction, because we really need new empirical input if we are to move our ideas about fundamental physics beyond the purely speculative stage.

    Nature needs to tell us the right direction to move in, and these two forefront experiments are the best bets we have, in my opinion. Experiments measuring dark energy are unlikely to reveal anything beyond what we know, and I don’t think the next generation of dark matter experiments will be online in time to report any discoveries next year.

    Ainissa Ramirez: Creating monsters?

    Dr. Ainissa Ramirez is a scientist and science communicator, now writing a book on how technology transforms us. She speaks worldwide about science and education and hosts the podcast “Science Underground.”

    The year 2018 is the 200th anniversary of Mary Shelley’s “Frankenstein,” in which a scientist neglects to ask about the consequences of his creation. I suspect (and hope) that there will be much debate on the impact of technology on our lives in the numerous lectures and events scheduled this year. It is a long-overdue discussion because scientists sometimes get so excited about their innovations that they forget to ask, “Am I building a monster?”

    This anniversary offers a pause to see if society likes where it is headed. With recent headlines such as a former Facebook leader expressing remorse for his invention, the anniversary of “Frankenstein” provides an opportunity to better understand the impact of our creations.

    J. Marshall Shepherd: Weather forecasting improves

    Dr. J. Marshall Shepherd is the Georgia Athletic Association Distinguished Professor at the University of Georgia in Athens and director of the university’s atmospheric science program. He served as the 2013 president of the American Meteorological Society.

    Weather forecasting is often perceived as guesswork by the public. There is not a meteorologist alive who has avoided jokes about the accuracy of forecasts. But these are misperceptions. The current era of weather forecasts, as witnessed during the society-altering 2017 hurricane season, is quite extraordinary because of rapid advances in meteorological knowledge, satellites, radar systems, and computer models. We now have technology in place to provide significant lead time for landfalling hurricanes, potentially tornadic storms, and multi-day flood events.

    In 2018, I foresee significant strides in the “other” side of the forecast and warning paradigm — the social science of meteorology. The weather community has become cognizant of confusion caused by hurricane cones, tornado warning “polygons,” and “watch-advisory-warning” terminology. Emerging research is exploring how the public consumes, interprets, and acts upon weather messaging, warning colors, and risk factors. I predict that the type of language and symbols used to convey weather warnings will be streamlined and intelligently designed based on factors ranging from psychological theory to cultural norms.

    Seth Shostak: A Super-Earth for our solar system?

    Dr. Seth Shostak is the senior astronomer at the SETI Institute in Mountain View, California, and director of the institute’s Center for SETI Research. He is the author of “Confessions of an Alien Hunter: A Scientist’s Search for Extraterrestrial Intelligence.”

    “My very educated mother just served us nine pizzas” was the clunky mnemonic used by generations of school kids to remember the nine planets of the solar system. Alas, in 2006, the pizzas were trashed by the International Astronomical Union, which stripped Pluto of its planet status. In the following decade, the sun’s family suffered another indignity when we learned that the most common types of planet in the cosmos are the so-called super-Earths, bulked-up rocky worlds about twice the diameter of our own. Lamentably, our solar system doesn’t have one.

    In 2018, remedies for both these affronts may be in the offing. It’s possible that a replacement for Pluto will be found — not literally a world to take its place, but one that can legitimately be called the ninth planet. Caltech astronomers Mike Brown and Konstantin Batygin have accumulated indirect evidence for an object that’s more than twice the diameter of Earth, but 10 times as far from the sun as Pluto. The hunt is on to find it, and the news will be big if astronomers succeed in bagging this prey. It will add a super-Earth to our solar system — one we can both study and possibly visit — and will give school kids a coveted ninth planet to love. Whatever name is chosen should begin with a “P.”

    Eric Topol: Gene-editing gains

    Dr. Eric Topol is a professor of molecular medicine at The Scripps Research Institute and founder and director of the Scripps Translational Science Institute, both in La Jolla, California. He’s the author of “The Patient Will See You Now: The Future of Medicine is in Your Hands.”

    There are two parallel, powerful technology movements that will ultimately prove to be transformative in medicine: CRISPR genome editing and deep learning artificial intelligence. In late 2017 we saw the first patient receive genome-editing treatment for a rare disease; in 2018, more than 10 different medical conditions are moving forward in clinical trials, including rare eye diseases, hemophilia, and sickle cell anemia. Genome editing has the potential to cure many diseases for which effective treatments have never been available.

    Likewise, deep learning, a subtype of artificial intelligence, is starting to show major potential in medicine. The algorithms have been demonstrated to interpret medical scans, skin lesions, heart rhythm abnormalities, and pathology slides as well as or better than specialist doctors. Deep learning will start to take hold in the clinic, first in ways to improve diagnostic accuracy and efficiency of doctors’ workflow, and ultimately for consumers as a virtual medical coach.

    Sherry Turkle: Social robots fool us

    Dr. Sherry Turkle is the Abby Rockefeller Mauzé Professor of the Social Studies of Science and Technology at MIT in Cambridge, Massachusetts, and founder and current director of the MIT Initiative on Technology and Self. Her most recent book is “Reclaiming Conversation: The Power of Talk in a Digital Age.”

    Robots have always seemed like the cavalry, called in to do the jobs that put the human body in danger. But we have long had the fantasy that they would do more: that they would be our caring companions. And now our dream comes true. My prediction: In 2018, sociable robots will be able to convince us that they are empathic by passing an emotional version of the Turing test, a behavioral test we have taken as a measure of machine intelligence. In the Turing test, a machine seems to “think” like a person. Simulated thinking (say, for the purposes of playing chess) may arguably be thinking, but simulated feelings are not feelings, and simulated love is never love. Our “success” in making robots that pretend empathy involves deception with significant consequences.

    Time Magazine’s “invention of the year” was awarded to Jibo, a sociable robot that responds to “Hey, Jibo,” and aspires to be a family friend. With our fantasy-turned-reality comes the problem that we become attached to robots in essentially inauthentic relationships because no matter what they “say,” they have no fondness or compassion to give. And we do get attached. We are vulnerable. Roboticists have learned that nurturance is the killer app. When we teach or care for a robot, we imagine that the robot cares for us in return. When we offer sociable toys and digital pets to our children, we embark on an experiment in which our children are the human subjects.

    Will we be honest enough to confront the emotional downside of living out our robot dreams?

    Moshe Vardi: Silicon Valley’s comeuppance

    Dr. Moshe Y. Vardi is the George Distinguished Service Professor in Computational Engineering and director of the Ken Kennedy Institute for Information Technology at Rice University in Houston. He is the author or co-author of more than 500 papers, as well as two books.

    In 2017, there was a sudden recognition of several adverse societal consequences of information technology, from job losses due to automation to manipulation of public opinion, with significant political consequences. This recognition has been accompanied by a dramatic drop in the public view of Silicon Valley, long considered a hub of innovation and economic growth. This view is expressed, for example, by Wall Street Journal columnist Peggy Noonan, who referred to tech’s CEO’s as “our country’s real overlords” and described them as “moral Martians who operate on some weird new postmodern ethical wavelength.”

    I expect this to become a major point of discussion in 2018, both inside the technology community, struggling to cope with its social responsibility, and at various levels of government, opening a discourse about the possible regulation of technology. We will hear more regrets from founders of tech companies about the addictive technologies they have launched. We will start a discussion of cryptocurrencies as a shadow banking system, which should be curtailed or tightly regulated.

    Wendell Wallach: Tech turns deadly

    Dr. Wendell Wallach is a senior advisor to The Hastings Center and chair of technology studies at Yale University’s Interdisciplinary Center for Bioethics in New Haven, Connecticut. His most recent book is “A Dangerous Master: How To Keep Technology from Slipping Beyond Our Control.”

    A serious tragedy will direct the attention of international leaders, under public pressure, to finally take on the difficult but incredibly necessary task of putting in place effective oversight and governance of emerging technologies. The tragedy may be the result of a broadly deployed cyberweapon that brings down critical infrastructure like a power grid, or a database hack that exposes sensitive user data. A terrorist or crazy kid may arm a drone with poison, killing innocent people. The proposed treaty to limit the development of lethal autonomous weapons will suddenly become more likely, as will stringent controls on the handling of sensitive user data by social media and other companies. Industry leaders, fearful of more stringent restrictions on their activities, will lead the way for thoughtful oversight of digital technologies.

    I may be wrong — nevertheless, reaping the benefits of innovation and managing risks must happen together.

    3i Infotech on the cusp of a major growth trajectory: Padmanabhan Iyer, MD & Global CEO

    Bangalore: 3i Infotech has envisioned major growth plans, on the back of strong tailwinds experienced over six successive quarters in the last two years. With revenue of over INR 1,000 crore in the last four years, the Company has added 139 new customers in FY17, growing its customer portfolio to more than 1,000 customers across 50 countries on four continents.

    The Company has implemented a three-phase ‘Protect-Consolidate-Grow’ approach that has been successful, as reflected in its operating margins. The approach has helped the organization retain its existing customers and win new ones. While maintaining a stable margin of 18 to 20 per cent, the Company improved its CRISIL rating to ‘CRISIL BB/Stable’.

    The Company reported a net profit of INR 100.65 crores in FY 2017, with an improved EBITDA over the past 5 years, while achieving an order-book balance of INR 572 crores as on March 31, 2017. The cash flow from operations has been positive and the Company’s net worth as on March 31, 2017 was INR 370.09 crores. The Company continues to be profitable and has reported a net profit in H1-FY2018.

    Positioning itself as a one-stop, next-gen IT enterprise, the organization has a balanced mix of business across all geographies: Americas (32%), EMEA (22%) and APAC (46%). It continues to strengthen its dominant presence across emerging markets, including MEA, APAC and India, with continued investments to grow its US business.

    Wellbeing technology in the workplace: a guide – Personnel Today

    In the first part of a major new series on wellbeing technology in the workplace, Stephen Haynes outlines the range of platforms and devices available. (See part 2 and part 3.)

    Technology related to health and wellbeing, or wellness, has long played a role in workplace health, but it has become more prevalent in recent decades with the move towards more proactive health management programmes, and even more so in the past decade with the growth in technological advances and greater ease of connectivity.

    When we think about technology in the context of health and wellbeing, we might think about wearable devices or smartphone applications, but there are a wide range of solutions. Recent decades have seen a number of advances, such as improvements to occupational health systems, attendance management software, health risk assessment tools and wearable devices.

    Artificial intelligence in healthcare

    Artificial intelligence (AI), or “augmented intelligence” as Daniel Kraft, chair for Medicine at Singularity University, prefers to call it, because it is about using technology to add to and enhance healthcare rather than replace the human aspect, is playing a huge role in the delivery of healthcare today and will continue to do so in the future.

    IBM Watson is supporting physicians using cognitive computing AI technology to recommend cancer treatments in remote areas.

    Meanwhile, robots are being used by doctors to deliver treatment, and Google Brain is using machine learning, working with hospitals to predict health outcomes from medical data so that it can ultimately train computers to predict when people may fall ill.

    The evidence base is also growing; for example, a study last year by Stanford University found that AI could identify skin cancer in photographs with the same accuracy as trained doctors, which could potentially enable a smartphone to act as a cancer scanner.

    It is important to distinguish AI from machine learning, a form of AI that enables computers to learn without being specifically programmed.

    Machine learning is part of everyday life: spoken commands to a smartphone rely on technology supported by machine learning, as do virtual personal assistants (for example, Siri, Cortana and Google Assistant).
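
    The phrase “learn without being specifically programmed” can be made concrete with a toy sketch: instead of hand-coding rules, the program induces its behaviour from labelled examples. The data, labels and function name below are invented for illustration; this is a minimal nearest-neighbour classifier, not any production ML system.

```python
import math

def nearest_neighbour(train, query):
    """Classify `query` with the label of the closest training example.
    No classification rule is hand-coded; behaviour comes entirely
    from the labelled examples supplied in `train`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(train, key=lambda ex: dist(ex[0], query))
    return label

# Toy, invented training data: (height_cm, weight_kg) -> species label.
examples = [((30, 4), "cat"), ((33, 5), "cat"),
            ((60, 25), "dog"), ((70, 30), "dog")]

print(nearest_neighbour(examples, (32, 4.5)))  # -> cat
print(nearest_neighbour(examples, (65, 28)))   # -> dog
```

    Adding or changing the examples changes the program’s behaviour without touching its code, which is the essence of the machine-learning approach.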

    A 2016 study by Harvard Medical School sought to understand whether digital platforms could diagnose conditions more accurately than doctors. It showed that doctors made a correct diagnosis more than twice as often (72%) as online symptom checkers (34%). However, the study was recently criticised for including some symptom checkers known to perform poorly.

    A 2016 study of Babylon Health’s automated triage system (Middleton et al, 2016) showed an accurate outcome in 88.2% of cases, compared with 75.5% for doctors; the system provided “a clinically safe outcome in 100% of cases, and performed an accurate triage in up to 90.2% of cases”.

    It was also quicker at diagnosing and triaging in almost 90% of cases, halving the time to triage for the average case assessed.

    In July 2016, Babylon released its first AI-enabled symptom checker and in January 2017, partnered with the NHS to use this technology to power an NHS 111 app available to over a million north London residents.

    Workplace health surveys and trends point to a key role for technology over the next couple of years, including: effective adoption of meditative practices into the workplace; financial education; support for the employee as a carer; improvements to the communication and integration of benefit and support services (including apps and platforms); and taking a more joined-up, strategic approach to the alignment of employee wellbeing with company mission, vision and values.

    Defining health and wellbeing technology

    Workplace health interventions are generally considered to fall into one of three categories: preventive, supportive or rehabilitative. Workplace health-related technology can be broadly categorised as:

    • platforms;
    • trackers;
    • workforce monitoring;
    • coaching and consultation; and
    • occupational health and safety.

    Platforms, trackers and workforce monitoring tools are mainly focused on prevention and support, while coaching and consultation and OH and safety are typically more supportive and rehabilitative in their nature. The latter, more familiar workplace health-related technology includes absence management platforms, OH-based technologies and tools adopted in health and safety.

    Platforms, trackers and workforce monitoring

    Platforms, or wellbeing portals, have been around for years, particularly in the US, and simpler versions of health and wellbeing programmes have also been found in employee benefit platforms.

    The role of the platform is to integrate an organisation’s programme: educating, capturing data, communicating and engaging all the relevant stakeholders across the business. There is much cross-over between platforms, trackers and workforce monitoring tools. Most platforms are able to capture data from wearable or app-based activity tracking devices. While some platform providers used to offer their own wearable device, this has become less prevalent as they recognise that most people who track activity use their own devices.

    Platforms capture information from employees, either input by the employees themselves or gathered automatically by a tracker (mainly personal devices and/or smartphone “mHealth” apps, but also some employer-provided tracking devices).

    These devices deliver tailored plans, coaching and support to staff, with aggregated data to the employer to feed into programme design. Some platforms enable integration of wider benefits (for example, financial and pension information), some offer more sophisticated data analysis, while others allow employers to schedule and track wellbeing-related activities, such as wellbeing challenges.

    Currently, platforms tend to come from three main channels:

    • Providers in the health and protection benefits space (insurers such as AXA PPP’s Health Gateway, Vitality’s Active Rewards, and Bupa Boost; and employee assistance programme providers such as Optum’s LiveWell platform).
    • Employee benefit advisers/platforms (Mercer Harmonise or Reward Gateway’s Yomp platform).
    • Wellbeing companies.

    Wellbeing companies include Virgin Pulse (which acquired the Global Corporate Challenge recently); SAAS provider Welltok (which recently acquired Keas); Limeade; Provant; Ceridian Lifeworks; Dadacoo; Umanlife; Wellness Checkpoint (from InfoTech – principally a health risk assessment vendor); Sonic Boom Wellness; Ritualize; WellBe Solutions; CoreHealth Corporate Wellness; CHC Wellness; Wellness Corporate Solutions; and RedBrick Health.

    We are beginning to see the introduction of artificial intelligence (AI) “chatbot” technology into eHealth – programmes that communicate through text or audio, simulating how a human would behave in conversation.

    More advanced technology means much of this passes what is referred to as the “Turing test”, referring to a machine’s ability to behave in a way equivalent to, or indistinguishable from, a person, which can create greater trust in and reliance on the information it provides.
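
    At the opposite end of the sophistication scale from systems approaching a Turing test, the basic shape of a text chatbot can be sketched as keyword matching against canned replies. Everything below (the rules, the wording, the function name) is hypothetical and purely illustrative; real eHealth chatbots rely on statistical natural-language processing rather than hand-written rules like these.

```python
import re

# Hypothetical keyword rules: (pattern, canned reply).
# Real systems learn responses from data rather than hard-coding them.
RULES = [
    (r"\b(stress|stressed|anxious)\b",
     "It sounds like you may be under stress. Would you like some resources?"),
    (r"\b(sleep|tired|insomnia)\b",
     "Sleep problems are common. Have you tried keeping a regular bedtime?"),
]

def reply(text: str) -> str:
    """Return the first matching canned reply, or a neutral prompt."""
    for pattern, response in RULES:
        if re.search(pattern, text.lower()):
            return response
    return "Tell me more about how you're feeling."

print(reply("I've been really stressed at work"))
```

    Even this trivial matcher shows why conversational systems can feel personal: the reply is contingent on the user’s own words, which is part of what builds the trust described above.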

    Babylon Health, the on-demand remote healthcare provider known for its remote GP services, is enhancing its existing AI-driven technology. The system already uses text from users to triage patients, either connecting them to a virtual GP or advising them to visit A&E, and it also provides test results and supporting health information.

    A growing number of platforms and workforce monitoring technologies capture employee feedback in real time. This addresses a limitation of the traditional annual survey, which captures only a single point in time and an employee’s intentions at that moment.

    Soma Analytics and Psychological Technologies provide bespoke interventions to support employees based on their inputs, while providing employers with aggregate wellbeing data. Of course, real-time capture is only as good as the quality and quantity of the input, as well as how you interpret the data and respond.

    Some providers go beyond simply offering the technology to capture the data, to offering metrics and even action plans based on the combined outputs, with real-time recommendations, advice and guidance back to line managers and HR. Some companies, such as Glint, use AI and natural language processing, which can capture and analyse qualitative data from multiple sources.
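As a rough illustration of how qualitative feedback might be aggregated, the sketch below scores free-text survey comments against hand-picked keyword lists. This is a deliberately simplified, hypothetical stand-in for the trained sentiment models a product such as Glint would actually use; all word lists and sample comments here are invented.

```python
from collections import Counter

# Illustrative word lists -- a real system would use a trained sentiment
# model and natural language processing, not hand-picked keywords.
POSITIVE = {"supportive", "flexible", "great", "valued", "happy"}
NEGATIVE = {"stressed", "overworked", "tired", "unclear", "frustrated"}

def score_comment(comment: str) -> int:
    """Crude sentiment score: +1 per positive word, -1 per negative word."""
    words = comment.lower().replace(",", " ").replace(".", " ").split()
    counts = Counter(words)
    return sum(counts[w] for w in POSITIVE) - sum(counts[w] for w in NEGATIVE)

def aggregate(comments):
    """Roll individual comments up into a simple team-level summary."""
    scores = [score_comment(c) for c in comments]
    return {
        "responses": len(scores),
        "net_sentiment": sum(scores),
        "negative_share": sum(s < 0 for s in scores) / len(scores),
    }

feedback = [
    "Feeling stressed and overworked this sprint.",
    "My manager is supportive and flexible.",
    "Priorities are unclear, a bit frustrated.",
]
summary = aggregate(feedback)
print(summary)
```

The aggregation step mirrors the point made above: individual inputs stay private to the employee-facing tool, while only rolled-up metrics reach HR or line managers.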

    Wellbeing expert Professor Cary Cooper sees real-time monitoring technologies playing a key role in the future of workplace health. He says: “Smartphone and wearable tech can already enable staff and employers to capture and manage data on how people perceive their working environment and let them give feedback in real time – without the employee having to do anything.”

    Cooper adds that Network Rail has already developed technology that enables it to monitor its people in real time.

    Nancy Hey, director of the What Works Centre for Wellbeing, says: “There are work-related aspects that we have not properly investigated from a wellbeing or performance angle yet. For example, the way workers take breaks is different today compared with 15 years ago; and what effect have home working and hot desking had? In many cases, the data is probably already there in terms of people survey information.”

    Hey adds: “There are a small number of employers that really look under the skin of their workforce to understand both the engagement and wellbeing drivers – for example, the Civil Service People Survey provides a wealth of information over a five-year period so it can monitor the impact of workplace trends such as redundancies and organisational change on worker wellbeing – measuring mood, purpose and job and life satisfaction.”

    Terminology and health and wellbeing technology

    There is much debate about how to categorise different types of health and wellbeing technology, from consumer and wider healthcare perspectives.

    The World Health Organisation (WHO) defines eHealth as “the use of information and communication technologies [ICT] for health”, while mHealth is seen as a component of eHealth which the Global Observatory for eHealth (GOe) defines as “medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants (PDAs), and other wireless devices”. More specifically, mHealth taps into smartphone capabilities such as SMS, GPRS, GPS and Bluetooth, as well as accelerometers – sensors that measure acceleration, from which the tilt of a mobile phone can be derived.
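To make the accelerometer point concrete: because a stationary phone’s accelerometer reads the pull of gravity across its three axes, an app can derive the device’s tilt from those readings. The sketch below uses the standard pitch/roll formulas; the function name and sample readings are illustrative, not taken from any particular SDK.

```python
import math

def tilt_angles(ax: float, ay: float, az: float):
    """Derive pitch and roll (in degrees) from the gravity components
    reported by a phone's 3-axis accelerometer (units cancel out)."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Phone pitched forward 30 degrees: gravity splits between the x and z axes
# (9.81 * sin 30 = 4.905, 9.81 * cos 30 = 8.496).
pitch, roll = tilt_angles(-4.905, 0.0, 8.496)
print(round(pitch, 1), round(roll, 1))  # 30.0 0.0
```

This is the kind of derived signal a step counter or posture-tracking mHealth app builds on.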

    In the September 2015 report “Patient adoption of mHealth”, Murray Aitken, from the IMS Institute for Healthcare Informatics, noted that the number of mHealth apps available to consumers exceeds 165,000, with the number of iOS apps increasing by more than 100% between 2013 and 2015.

    The report adds that most apps concentrate on wellbeing, diet and exercise, with about a quarter focusing on disease and treatment management. However, the report notes that more than half of mHealth apps have limited functionality and most simply provide information.

    The study found that these apps increasingly have the capability to connect to other devices or sensors (defined by Wikipedia as an electronic component, module or subsystem whose purpose is to detect events or changes in its environment and send the information to other electronics), as well as social media connectivity and communication with provider healthcare systems (albeit only 2%).

    In addition to smartphones, watches and wearables, some of the sensor categories that experts believe will be most applicable to mHealth app solutions in the next five years include implantable sensors (for example, glucose detection for diabetes control) and ingestible sensors, which are swallowed.

    There are about 260,000 mHealth smartphone apps currently available, and more than half are targeted at people with chronic conditions, such as hypertension, heart disease, cancer, depression and diabetes, according to Research2Guidance research published in 2016.

    From the 3,677 respondents to the 2014 National Cancer Institute’s Health Information National Trends Survey (a cross-section of US citizens’ use and access of health information), Carroll et al identified that the main users of health apps are younger, with high levels of education, health and income.

    The survey also showed that app use is associated with intentions to change diet and physical activity, although those intentions were self-reported rather than validated.

    Despite advances in the nature and volume of health apps, there are still limitations – in particular, the limited evidence of clinical effectiveness and a lack of integration with healthcare systems.

    Apps don’t just focus on wellness, diet, exercise or disease and treatment management. Some of the popular work/productivity-related apps (not necessarily evidence-based) include Headspace, which helps improve focus and concentration, and Focus Booster, which breaks work down into intervals separated by short breaks.

    There are numerous stress management apps, including Worry Watch, which lets you write down what is bothering you, track whether or not the outcome was as bad as you originally thought and then assess the trends and patterns in your anxiety visually.

    Telemedicine: Broadly defined as remote diagnosis and treatment of patients via information and communication technologies. The WHO considers it an evolving science because it is constantly incorporating new advances while adapting to changing societal health needs and contexts.

    The WHO has a broad, all-encompassing definition of telemedicine, which is: “The delivery of healthcare services, where distance is a critical factor, by all healthcare professionals using information and communication technologies for the exchange of valid information for diagnosis, treatment and prevention of disease and injuries, research and evaluation, and for the continuing education of healthcare providers, all in the interests of advancing the health of individuals and their communities”.

    Telehealth: The terms telemedicine and telehealth are often used interchangeably, although there is much debate as to whether they mean the same thing. The Telecare Services Association defines telehealth as: “the remote exchange of data between a patient at home and their clinician(s) to assist in diagnosis and monitoring”.

    Telehealth is commonly used to support patients with long-term/chronic conditions such as chronic obstructive pulmonary disease; diabetes; and epilepsy. Telehealth solutions give patients greater control over their health by monitoring and transmitting information about their condition in real-time to a central point where it can be assessed against baselines set by their doctor.

    When measurements fall outside these parameters, the telehealth provider can instigate an appropriate response. It also enables doctors to monitor a patient’s condition and intervene remotely. Evidence suggests it can reduce emergency and hospital admissions, as well as mortality rates.
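The baseline-checking logic described above can be sketched very simply: each reading is compared against the range set by the patient’s doctor, and anything outside it triggers an alert. The metric names, ranges and alert strings below are hypothetical, not drawn from any vendor’s system.

```python
# Clinician-set baselines: metric -> (lower bound, upper bound).
# These example ranges are illustrative only, not clinical guidance.
BASELINES = {
    "systolic_bp": (90, 140),
    "blood_glucose_mmol": (4.0, 7.8),
    "resting_heart_rate": (50, 100),
}

def check_reading(metric: str, value: float) -> str:
    """Compare a transmitted reading against its baseline range."""
    low, high = BASELINES[metric]
    if value < low:
        return "alert: below baseline"
    if value > high:
        return "alert: above baseline"
    return "ok"

print(check_reading("systolic_bp", 152))        # alert: above baseline
print(check_reading("blood_glucose_mmol", 5.6)) # ok
```

In a real deployment the alert would route to the telehealth provider for triage rather than straight to the patient.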

    Telecoaching: While no clear definition of telecoaching exists, it is a subset of telemedicine that uses communication and technology devices such as phone, computer or tablet to provide remote treatment and advice. Generally speaking, the remote patient monitoring elements of telemedicine and telehealth would be considered too invasive for the employer/employee relationship, but we do see elements of these types of technology in workplace health settings. These are commonly found in the form of remote GP/specialist services, counselling services and online medical record services.

    Pre-primary care: While not in itself technology specific, the concept coined by Your.MD in its 2016 report “Pre-primary care: an untapped global health opportunity” (Carr-Brown J, Berlucchi M, 2016) is worthy of mention as it seeks to encompass any form of initiative or service intended to help people understand what’s wrong with them and connect them to relevant healthcare services.

    As Professor Sir Muir Gray says: “The term ‘primary care’ is a misnomer. The first thing citizens and patients do is think what they can do for themselves, the second is to seek advice from friends and family, and in the past 20 years, the internet. Then they seek professional help.”

    Convergence of wellbeing technology

    All of the platforms vary in scope and focus, and are getting smarter. It would appear that, in the near future, we will converge on a portal that uses AI combined with machine learning to assess multiple employee engagement, motivation, health and wellbeing factors.

    This can be used to deliver specific and tailored interventions, coaching and support to individual employees – using chatbot technology that supports conversations through voice or text – across diverse and dispersed workforces.

    If the industry can work together effectively, then arguably technology could enable data to be captured and interpreted, from attendance data to stress risk assessments to engagement and health insurance expenditure. Devices will be broadly compatible between health benefit providers, OH and attendance management platforms, as well as personal tracking devices and apps.

    Of course, all this should be wrapped up in a simple-to-use and easy-to-administer tool for staff and programme managers. The technology exists to make it happen already – and in many cases, the demand is being driven by a small number of forward-thinking employers and technology solutions providers, as well as the payers and insurers.

    Stephen Haynes is a workplace health specialist, and currently the programme lead for Mates in Mind, which provides the UK construction industry with a framework to improve mental health by raising awareness and addressing the stigma of poor mental health.


    Carr-Brown J and Berlucchi M (2016). Pre-primary care: an untapped global health opportunity.

    mHealth New horizons for health through mobile technologies (2011). Global Observatory for eHealth series, vol.3. World Health Organisation.

    Middleton K, Butt M, Hammerla N, Hamblin S, Mehta K, and Parsa A (2016). “Sorting out symptoms: design and evaluation of the ‘Babylon check’ automated triage system”. Nursing Education Perspectives March/April 2017, vol.38(2), pp.108-109.