Mission Health M.D. Leader on The Power of Technology to Improve Health and Bring Humanity Back to Medical Practice – Healthcare Informatics

Mission Health Chief Quality Officer Chris DeRienzo, M.D., details the organization’s analytics journey to improve clinical outcomes

In an increasingly digitized healthcare world, it is widely understood that data and analytics are the backbone to many clinical and operational improvement efforts. At Mission Health, a seven-hospital health system based in Asheville, North Carolina, senior executive leaders have created a culture of continuous improvement grounded in analytics.

“To drive continuous improvement using analytics, the equation involves the people, the process in which those people are working and the data and the technology that support the people and the process. The analytics are really only just one part of the story, but at Mission Health, I think we’ve shown that they can be a critical component, a catalyst if you will, to drive the reaction of improved outcomes, clinically and operationally,” Chris DeRienzo, M.D., chief quality officer at Mission Health, says.

Mission Health serves communities across 18 counties in western North Carolina with 800 employed providers across 140 practices, as well as an accountable care organization that includes hundreds of physicians and more than 90,000 patients. By leveraging analytics, Mission Health has made gains in areas such as reducing readmissions; improving sepsis, stroke, and heart failure outcomes; and scaling the preventive care needed to succeed in ambulatory population health.

Foundational to this work, DeRienzo says, is having a reliable enterprise data warehouse and analytics environment, as well as clinical program leadership and a team of Lean engineers. The first step, he says, is to align analytics efforts with the organization’s core purpose, what Mission Health executives refer to as the “Big(ger) Aim”— “getting every person to their desired outcome, first without harm, also without waste and always with an exceptional experience for each person, family and team member.”

During an interview with Healthcare Informatics, DeRienzo outlined several analytics projects that demonstrate the organization’s progress, to date, in creating a culture of data-driven continuous improvement and harnessing analytics to drive clinical and operational performance.



As a result of this work, Mission Health has seen, as of February 2018, an across-the-board drop in readmissions; a 58 percent increase in sepsis detection; a 32 percent reduction in severe sepsis mortality rates; a 20 percent increase in on-time surgery starts; 12 lung cancer deaths avoided and a 37 percent increase in lung cancer screening; and other improvements across population health outcomes—all while realizing a 64 percent reduction in staff hours spent collecting data and preparing reports.

Data-Driven Care Process Models Drive Results

In an effort to address clinical variation, Mission Health clinical and executive leaders set out several years ago to develop care process models (CPMs). “We started by bringing together the clinical teams to help identify the best practice, the informaticists to help bake those best practice workflows into the electronic health record (EHR) in as frictionless a way to use as possible, and then an analytics team to measure, not only utilization of the pathway but also the outcomes that we’re trying to drive. That’s the core recipe of a CPM,” DeRienzo says.

In the first nine months of 2017, 24 CPM teams were created that involved clinical leaders, both physicians and nurses, as well as members of the Mission Analytics team, performance improvement experts and committees to review care plans. Those 24 CPMs focused on clinical issues such as COPD (chronic obstructive pulmonary disease) exacerbation, breast cancer screening, depression, hypertension and chest pain, just to name a few. As part of this work, the analytics team created dashboards that provide real-time data and analytics at the provider and patient level to track performance.

“As we’ve begun to scale that work, the analytics have really helped us to drive the kind of conversations that we need to have, both to improve our care process models themselves as well as to drive adherence and then show how our CPMs are helping to change outcomes for patients,” DeRienzo says.

Speaking to just the COPD exacerbation CPM, DeRienzo says, “Once folks use the COPD exacerbation CPM, they have 100 percent adherence to the goal guidelines. We’re now up to over 80 percent utilization, and we have modeled that if we got to 100 percent utilization, we would reduce a number of ED visits and a number of inpatient hospitalizations, and yield something on the order of just under $200,000 in reductions in direct costs.”

Mission Health now has 60 CPMs live across inpatient and ambulatory operations and is on track to have 80 models live by the end of the year. “We’re bringing one new CPM live about every two weeks,” DeRienzo says. “The result has been thousands more patients screened for cancers, reductions in mortality and reductions in direct costs. It’s across the board.”

DeRienzo credits Mission Health’s board and senior leadership for its ongoing investment in analytics to enable the organization to drive forward with these efforts. “They are believers in how technology can be leveraged to improve outcomes, and it’s required millions of dollars of investment in infrastructure, in an enterprise data warehouse, and in people time to build the clinical and administrative team needed to do something with the data,” he says.

He also credits Mission Health’s success to “an alchemy of people, processes and technology,” adding, “Our clinicians, the PI team, the informaticists, the analytics people, everybody here is here for the right reasons, they really care. My role is to help equip them with the teams and the technologies needed to get to where they want to go, and that creates an enormous capacity for change.”

Readmissions Prediction Tool Leveraging Machine Learning

With an eye toward keeping patients healthier after hospital discharge and to reduce readmissions, Mission Health leaders initiated a project to use data and analytics to automate the calculation of a risk model for 30-day inpatient readmission.

“This was our first foray into the machine learning space; trying to leverage the power of data to better target patients who had been discharged from the hospital for our care managers to focus on and to try to keep them safe at home,” DeRienzo says. “This gets to where I think the power of technology is going, which is leveraging data and analytics to decrease the amount of time that humans have to spend doing things that humans don’t absolutely have to do in healthcare.”

Mission Health’s data science team was tasked with creating a risk model that “beats” the LACE index (LACE is based on four factors: length of stay, acuity of admission, co-morbidities and emergency room visits). The project team also wanted a model that would provide clinicians with predictions by 8 a.m. for every patient discharged in the past 24 hours, indicating each patient’s likelihood of being readmitted compared to a baseline model. Once the model was created, project leaders worked with the care management team to implement it back in February, DeRienzo says. Throughout the pilot phase, the risk model and the user interface in the dashboard have been continuously refined based on user feedback.
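
The article does not detail Mission Health's proprietary model, but the LACE baseline it set out to beat is a published scoring rule. As a minimal sketch (using the standard point assignments from the original LACE literature, not anything specific to Mission Health), the index can be computed as:

```python
def lace_score(los_days: int, acute: bool, charlson: int, ed_visits: int) -> int:
    """Standard LACE readmission-risk index (range 0-19).

    los_days   - length of stay in days
    acute      - admission was acute/emergent
    charlson   - Charlson comorbidity index
    ed_visits  - emergency department visits in the prior six months
    """
    # L: length-of-stay points (1-3 days score their own value, then banded)
    if los_days < 1:
        l = 0
    elif los_days <= 3:
        l = los_days
    elif los_days <= 6:
        l = 4
    elif los_days <= 13:
        l = 5
    else:
        l = 7
    # A: acute/emergent admission adds 3 points
    a = 3 if acute else 0
    # C: Charlson comorbidity index, with scores of 4+ capped at 5 points
    c = charlson if charlson <= 3 else 5
    # E: ED visits in the prior 6 months, capped at 4 points
    e = min(ed_visits, 4)
    return l + a + c + e
```

A patient with a 5-day acute admission, a Charlson index of 2, and one recent ED visit scores 10, a level commonly treated as high risk; a machine-learning model "beats" this baseline by discriminating readmissions better than the raw score.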

“Six months from now, we will have worked through the implementation science around using it as optimally as we use it in practice, and my strong suspicion is that six months from now, we’ll see a meaningful impact on our outcomes,” he says, adding, “Care managers spend a good portion of their time right now figuring out which patients to focus on, and this model helps them spend less time focusing on which patients to help and actually spend more time helping patients. And in the short term, as we piloted that in the first 90 days, they are now seeing a whole new universe of people who were opaque to them before.”

DeRienzo says this project exemplifies the potential of machine learning technology in healthcare. “To me, technology can help return some humanity to the way we practice medicine. And, when it does that best is by reducing the amount of time that humans spend doing things that we don’t necessarily have to do,” he says. “When you strip away all the things that humans don’t have to do in healthcare, really what you’re left with is sort of the raw things that make us human, such as my interactions with you as a patient.”

As health systems and provider organizations look to move forward with machine learning and data analytics work, DeRienzo recommends that healthcare leaders recognize that analytics is only one part of the equation. “The power of your analytics is directly related to the people who are going to use them, the process in which you are working and you have to begin with that end in mind. In our experience, starting our analytics journey with work that was core to our mission, core to our purpose, really gave us enough runway to then try and fail, try and fail and then try and succeed and success breeds more success.”

And, he says, “Technology and process exist in healthcare to help humans do more of what only humans can do. That analytics and the tools that we can use, the potential for leveraging AI, all is incredible, but fundamentally, at its core, healthcare is about people caring for other people. And I think if folks remember that and if that’s at the center of how they approach their technology decisions, there will be a lower risk of them losing their North Star.”

Trump admin continues privacy listening sessions – Politico

With help from Nancy Scola, Margaret Harding McGill and John Hendel

THE TRUMP ADMINISTRATION’S PRIVACY LISTENING SESSIONS — Three dozen or so tech industry reps were at the Commerce Department on Wednesday to talk online privacy in a conversation led by NTIA, per the Information Technology Industry Council (ITI), whose spokesman, Jose Castaneda, described the session as one on “how to best protect personal privacy while also responding to consumer demand for innovative products and services.” International Trade Administration and NIST officials also took part.


— The meeting is part of a rolling administration initiative to figure out how best to cope with the new world order of online privacy, which is being shaped by the EU, with its sweeping GDPR regulation, and California, with its aggressive privacy law passed last month. With growing public attention on what happens to consumer data, industry and administration officials are trying to navigate what they see as a tolerable path forward on privacy.

— The mood in the industry: When California’s privacy law came up during POLITICO’s panel on AI earlier Wednesday, ITI President Dean Garfield responded with a quick “ugh.” Though he did not elaborate on the remark, he later warned of “fragmentation” on privacy regulation at the state level. “What I would suggest is moving quicker in trying to come up with certain standards and norms that are broadly applicable so we don’t have that,” he said.

— The mood in the administration: Walter Copan, director of the National Institute of Standards and Technology, discussed Europe’s new privacy legislation in less-than-glowing terms at the panel. “Clearly we’re in the regime of GDPR which has been certainly foisted on not only Europe and European citizens,” he said. “So dealing with that patchwork of patchworks, if you will, it’s important for government to have an important voice, a clear voice, but also to be part of the American system and the creation of standards that truly reflect the principles of this nation, a free market economy and fairness.”

GREETINGS AND WELCOME TO MORNING TECH, where your host is begrudgingly adopting les bleus in the World Cup as a thank you for four glorious months abroad back in school. Got any tech or telecom tips? Drop me a line at clima@politico.com or @viaCristiano. Don’t forget to follow us @MorningTech. And catch the rest of the team’s contact info after Quick Downloads.

IN CASE YOU MISSED IT — POLITICO hosted a conversation on the role of government and its implications for AI growth in national public safety, privacy and civil rights. Watch the full video here to see how artificial intelligence is accelerating rapidly — from social media bots to facial recognition technology to driverless vehicles.

TECH GIANTS HILL-BOUND — Google and Facebook plan to show up to the latest congressional hearing devoted to conservative claims of ideological bias on social media, Ashley reports for Pro. And according to a congressional aide, officials for Twitter are also expected to be on hand. The companies declined to confirm who will be representing them at the July 17 hearing before the House Judiciary Committee, but the congressional aide said “executives” from Google and Facebook are expected to appear. It will be the second such hearing by the committee, which in April heard testimony from pro-Trump social media personalities Diamond and Silk about their alleged mistreatment at the hands of Facebook. (For more on that chaotic spectacle, which MT won’t easily forget, read here.)


AGREE TO DISAGREE ON ZTE — The Chinese telecom giant is one step closer to ending the U.S. ban imposed by the Trump administration. The Commerce Department announced that ZTE would put $400 million into escrow — money to be seized by the U.S. should the company violate the terms of the settlement, John reports for Pro. In exchange, Commerce will sign off on lifting a ban on the company imposed earlier this year in response to its alleged illegal sales to North Korea and Iran. In recent weeks, ZTE has shaken up its leadership structure to comply with U.S. demands, even as lawmakers have threatened to upend the arrangement.

— The news did little to quiet critics in the Senate, which last month approved a measure to reimpose the penalties originally enacted against ZTE by the Trump administration. “Allowing ZTE to resume business is a direct betrayal of @realDonaldTrump’s promise to be tough on China & protect American workers,” tweeted Senate Minority Leader Chuck Schumer. Expressing “grave concerns” over the “sweetheart deal” struck between the White House and ZTE, Sen. Mark Warner (D-Va.) said the telecom giant — whose ties to the Chinese government have drawn scrutiny — “presents an ongoing threat to our national security.” Members of the House and Senate are hashing out differences between their must-pass defense bills, both of which included language addressing the ZTE sanctions.

BEWARE OF BEIJING ON AI? — China’s technology progress took center stage during yesterday’s POLITICO panel on the future of artificial intelligence, moderated by Nancy. Officials and tech experts sounded off on the potential threat posed by Beijing’s embrace of emerging technologies. “Our private sector is unrivaled,” said Rep. John Delaney (D-Md.). But by skirting ethical standards, he said, China has risen to challenge the U.S. economically and technologically. “We have to put them in a position where they play by the rules,” he said.

— AI’s bias problem: The panelists also discussed a hot-button topic for the field of AI — how the technology could be used in a way that fuels discrimination and bias. “One of the biggest concerns in artificial intelligence is bias,” said Rep. Will Hurd (R-Texas), adding “you have to make sure there’s not bias going into the development of those algorithms.” Rashida Richardson, director of policy research at NYU’s AI Now Institute, said it’s important to incorporate different viewpoints when developing policies around AI. “When you don’t have a lot of people in the room or even people that understand these varying degrees of how we understand these terms, then you end up in situations where you can have a tool that is said to be perfect, not biased, but it isn’t,” Richardson said.

HOUSE DEMS, FCC BRAWL OVER CONSUMER COMPLAINTS — FCC officials told MT that a letter from two top House Energy and Commerce Democrats voicing concern over a proposed rule change to how the FCC handles consumer complaints was “completely inaccurate.” In a letter to FCC Chairman Ajit Pai, Democratic Reps. Frank Pallone (N.J.) and Mike Doyle (Pa.) argued the move “would eliminate the agency’s traditional and important role of helping consumers in the informal complaint process.” But FCC press secretary Tina Pelkey said there was no such language in the proposal. Pelkey said the action “would simply align the text of a rule with longstanding FCC practices that have been in place for years under prior Chairmen and Commissions.”

— Dems not backing down: In a statement provided to MT, Doyle said, “The spin coming out of the FCC does nothing to change the fact that the Chairman is proposing to modify the informal complaint rules in a way that would hurt consumers.” And a Democratic E&C committee aide said that by eliminating specific language alluding to the “review and disposition of the matters raised” by complainants, the agency was striking a clause “key to that informal review process.” The FCC is scheduled to take up the proposed rule change at today’s open commission meeting at 10:30 a.m. — but a report emerged Wednesday indicating the vote on the original proposal may be scrapped. Two Democratic committee aides told MT they were not aware of a change as of Wednesday evening.

ALSO ON DECK — The FCC will vote on Republican Commissioner Mike O’Rielly’s proposal to relax rules requiring broadcast TV stations to air educational programming for children. The so-called Kid Vid rules require TV stations to carve out an average of three hours of airtime each week for shows that boast regular children’s episodes, each at least a half-hour long. O’Rielly’s proposal would end the 30-minute mandate and let broadcasters use secondary subchannels to air the shows.

— But O’Rielly is facing criticism from Democratic lawmakers, including Massachusetts Sen. Ed Markey, who said at a press conference Wednesday that the FCC should instead open up an inquiry to investigate how children are benefitting from the current rules before launching a rulemaking. “We shouldn’t do deregulation just for the sake of deregulation, and especially not for rules that impact our nation’s children,” Markey said. Expect O’Rielly to emphasize that this is just the beginning of the process, and that calls to change the item to an inquiry are nothing more than “Washington speak for injecting unnecessary delay and distraction,” according to a preview of his remarks shared with MT.

— Another item up for a vote is a proposal to explore how to make spectrum available in the so-called C-band for wireless use. The spectrum is currently used to deliver programming to television and radio broadcasters, and the proposal has prompted concerns from NPR about the potential impact on public broadcasting.

MID-BAND SPECTRUM MOVES — Reps. Doris Matsui (D-Calif.) and Brett Guthrie (R-Ky.), who chair the Spectrum Caucus, pressed FCC Chairman Ajit Pai to look at ways to open the 6 GHz band of wireless airwaves for unlicensed uses like Wi-Fi. Although tech companies favor such action, telecom incumbents like AT&T are less enthused. Pai has said he plans to move forward on a rulemaking for the spectrum in the fall.


— Unpro-tech-ted: With President Donald Trump and Chinese President Xi Jinping trading further tariff threats, Silicon Valley has been left particularly vulnerable, Bloomberg reports.

— Just another day…: Facebook allowed a Kremlin-linked Russian internet company to collect data on unknowing users on its platform, CNN reports.


— White neighborhoods are more likely to reap the benefits of Airbnb visitors than their black and Latino counterparts, according to a new study, via the Washington Post.

— Putin a tough spot: Facebook’s algorithm temporarily identified 65,000 Russians as interested in treason, a label that could have put them at risk of retaliation from President Vladimir Putin’s repressive regime, according to The Guardian.

Tips, comments, suggestions? Send them along via email to our team: Eric Engleman (eengleman@politico.com, @ericengleman), Kyle Daly (kdaly@politico.com, @dalykyle), Nancy Scola (nscola@politico.com, @nancyscola), Margaret Harding McGill (mmcgill@politico.com, @margarethmcgill), Ashley Gold (agold@politico.com, @ashleyrgold), Steven Overly (soverly@politico.com, @stevenoverly), John Hendel (jhendel@politico.com, @JohnHendel) and Cristiano Lima (clima@politico.com, @viaCristiano).


Chipmaker Broadcom inks US$19 billion deal to buy CA – Computerworld Australia

Broadcom Inc has announced an $18.9 billion deal to buy U.S. business software company CA Inc, venturing far beyond its realm of semiconductors and testing investors’ confidence in chief executive Hock Tan’s deal-making credentials.

The CA deal, outlined in a joint statement from the companies, comes just four months after U.S. President Donald Trump blocked Broadcom’s $117 billion hostile bid for semiconductor peer Qualcomm Inc, arguing it posed a threat to US national security and gave an edge to Chinese companies looking to build next-generation wireless networks.

Since then, Broadcom has redomiciled from Singapore to the United States, placing it formally outside the purview of the Committee on Foreign Investment in the United States (CFIUS), the government panel that reviews deals for potential national security risks.

Dealmaking has been key to Broadcom’s expansion, as it grew from a four percent share of the chip market in 2013 to a 30 percent share this year, thanks to acquisitions spearheaded by Tan with backing from private equity firm Silver Lake.

Tan’s selection of CA as Broadcom’s next acquisition target, however, took Wall Street by surprise, and drove Broadcom shares down seven percent in after-hours trading. Investors and analysts scrambled to identify potential synergies, as the deal looked more like a financial investment than a combination of complementary businesses.

Hammer blow for Huawei as 5G ban looms in Australia – Computerworld Australia

Australia is preparing to ban Huawei Technologies from supplying equipment for its planned 5G broadband network after its intelligence agencies raised concerns that Beijing could force the Chinese telco to hand over sensitive data, two sources said.

Western intelligence agencies have for years raised concerns about Huawei’s ties to the Chinese government and the possibility that its equipment could be used for espionage.

But there has never been any public evidence to support those suspicions.

Huawei, the world’s largest maker of telecommunications network gear and the No. 3 smartphone supplier, has promised that Canberra will have complete oversight of 5G network equipment, which could include base stations, towers and radio transmission equipment.

That sort of oversight model has been accepted by other countries – notably the U.K., where a special laboratory staffed with government intelligence officials reviews all Huawei products.

The Top 3 Priorities for CIOs – InformationWeek

CIOs who want their companies to succeed with digital transformation need to have a long-term approach, focusing on the long-tail benefits.

We’ve already started our way down the steep hill of digitization, and there’s no turning back. The changes we see today will only increase in speed and intensity. At the helm, guiding the organization as it approaches top velocity? The CIO.

In the olden days, CIOs simply executed projects. Now, entire business strategies depend on how well CIOs can navigate their world. Nearly every major analyst firm has published a report detailing the challenges that CIOs face, and those reports show a striking consensus on what the challenges are.

As CIOs look to the future, they should consider putting the following three steps at the top of their to-do lists:

1. Diving into new technologies, not waiting on the sidelines. CIOs should constantly ask themselves what else can be turned into ones and zeroes. The digital takeover is far from complete, and Gartner’s research suggests that digital transformation is a higher priority for CIOs than profit improvement, innovation, R&D, and cost reduction. Yet even among the top-performing CIOs’ organizations, only 50% of their processes are digitized.

Forrester echoes this sentiment, suggesting that as many as 60% of CIOs are behind in their digital transformation efforts. Many CIOs I work with in various industries have encountered a similar issue, but by incorporating citizen developer platforms, they have empowered their workforces and advanced both their digital transformation and automation efforts.

Catching up won’t happen overnight, but CIOs should start by identifying places in their organization that could be improved with tech. Then, they should look at other early adopters to see what strides are being made in that area. CIOs could also consider implementing pilot programs within their organizations to test new solutions and share what they learn from each test.

2. Researching master data management solutions. Data is the gold that a digital transformation allows us to mine. According to Gartner, CIOs are seeing business intelligence and analytics as their most important differentiators. Those at the head of the pack are already reaping the benefits of CRM and ERP systems, but as a whole, they’re moving toward long-tail technology that paints an even fuller picture.

Most analytics projects get stalled because of a lack of clean data. CIOs should implement a master data management solution to help get clean master data entered into a central warehouse. An enterprise service bus will help send transaction data there as well. It’s easy to get bogged down in perfecting data entry, but CIOs must try to maintain realistic tolerances when it comes to cleanliness.
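
The article names no particular MDM product, but the core idea (match duplicate source records on a normalized key, then apply a survivorship rule to produce one "golden record" per entity) can be sketched in a few lines. The `email` match key, the `updated` timestamp, and the last-write-wins rule below are illustrative assumptions, not anything a real MDM suite mandates:

```python
from collections import defaultdict

def build_master_records(records):
    """Collapse duplicate source records into one golden record per entity.

    Each record is a dict containing an 'email' (used here as the natural
    match key) and an 'updated' timestamp. For every field, the most
    recently updated non-empty value survives, a simple survivorship rule.
    """
    grouped = defaultdict(list)
    for rec in records:
        key = rec["email"].strip().lower()  # normalize the match key
        grouped[key].append(rec)

    masters = {}
    for key, recs in grouped.items():
        recs.sort(key=lambda r: r["updated"])  # oldest first
        golden = {}
        for rec in recs:
            for field, value in rec.items():
                if value not in (None, ""):
                    golden[field] = value  # later non-empty values overwrite
        masters[key] = golden
    return masters
```

In practice the hard part is the matching step (fuzzy names, addresses, merge conflicts), which is why the advice above is to keep realistic tolerances rather than chase perfectly clean data.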

3. Building symbiosis between their digital processes and their teams. CIOs have long turned to software to help them solve problems, but the machine is now just as much a part of the team. Chatbots, robotic process automation, business process management, and machine learning are capable of performing far more work in an hour than a human could in a thousand years. According to Gartner, 25% of CIOs are already planning to implement AI projects, and Forrester predicts that 10% of all purchase decisions will be informed by an intelligent agent.

The first step to building symbiosis is to create a process catalog to streamline who (or what) completes which tasks. CIOs need to keep their catalogs up to date so they remain useful, especially once human-centric and system-centric processes have been identified. These two labels give CIOs insight into which tasks are good candidates to be handled by AI.
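
A process catalog of this kind can be as simple as a tagged list. The sketch below is a hypothetical minimal structure (the field names and the volume-based ranking rule are illustrative assumptions), showing how the human-centric/system-centric labels surface automation candidates:

```python
from dataclasses import dataclass
from enum import Enum

class Owner(Enum):
    HUMAN = "human-centric"
    SYSTEM = "system-centric"

@dataclass
class ProcessEntry:
    name: str
    owner: Owner
    runs_per_week: int  # rough volume, used to rank automation candidates

def automation_candidates(catalog):
    """High-volume, human-owned processes are the first to consider for AI/RPA."""
    return sorted(
        (p for p in catalog if p.owner is Owner.HUMAN),
        key=lambda p: p.runs_per_week,
        reverse=True,
    )
```

Keeping the catalog current then means re-tagging entries as automation moves processes from the `HUMAN` column to the `SYSTEM` one.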

Businesses are continuing to gain new ground in the quest for digitization, meaning CIOs must rapidly realign their priorities to reflect current market conditions. By drawing more of the business into the digital fold, they’ll be better equipped to analyze problems, introduce automation and AI, and keep their organizations protected from a growing list of security threats.

Over the next three years, the pressure will mount for CIOs to make these changes, and the gap between those who implement them successfully and those who falter will widen considerably. Ultimately, however, digitization is a journey, not a destination. Those who approach it as such will be at a considerable business advantage in the coming decade.

Suresh Sambandam is the CEO of KiSSFLOW, a disruptive, SaaS-based enterprise-level workflow and business process automation platform with more than 10,000 customers across 120 countries. He is an expert and renowned entrepreneur on a mission to democratize cutting-edge technologies and help enterprises leverage automation.


Meet the 30 healthcare leaders under 40 who are using technology to shape the future of medicine – Business Insider


The healthcare industry has no shortage of big ideas.

Whether by eliminating the hassle of visiting a brick-and-mortar pharmacy or by changing the way we store and access personal data, young leaders are working to make healthcare a better experience for everyone.

Drawing from nominations that came in from top healthcare executives, entrepreneurs, and investors, Business Insider has come up with 30 leaders under the age of 40 who are shaping the future of medicine.

Here’s our list of top young leaders engaged in groundbreaking work in healthcare technology, listed alphabetically.

Inside X, the Moonshot Factory Racing to Build the Next Google – WIRED

Any project hoping to qualify as X-worthy must fall in the middle of a three-circle Venn diagram. It must involve solving a huge problem. It must present a radical solution. And it must deploy breakthrough technology.

That definition, which X uses to separate the delivery drones from the invisibility cloaks, didn’t exist in 2010, when X first took shape. The effort started with an experiment: Larry Page asked a Stanford computer science professor, Sebastian Thrun, to build him a self-driving car. At the time, Thrun knew as much about the technology as anyone: He had led Stanford’s winning bid in the 2005 Darpa Grand Challenge, a 132-mile race for fully autonomous vehicles across the Mojave Desert outside Primm, Nevada. When Darpa held another race in 2007, the Urban Challenge, the agency thickened the plot by making the vehicles navigate a mock city, where they had to follow traffic laws, navigate intersections, and park. Stanford came in second (Carnegie Mellon won), and Thrun, who was already doing work with Google, came to the company full-time, helping develop Street View.

The Darpa Challenges had proven that cars could drive themselves, but the feds weren’t holding any more races. American automakers were focused on surviving an economic collapse, not developing tech that could devastate their businesses. Google was a software company, but it had mountains of cash, and it was clear that bringing this idea to market had the potential to save lives, generate fresh revenue streams, and extend Google’s reach into one of the few places where looking at your phone is not cool.

So Thrun quietly hired a team, passing over the established academics who led the field in favor of a younger crew, many of them Darpa Challenge veterans, with less ingrained ideas about what was impossible. (They included Anthony Levandowski, who eventually found himself at the center of a bruising lawsuit with Uber, which the companies settled in February.) Page set his own challenge for the team, selecting 1,000 miles of California roads he wanted the cars to navigate on their own. Thrun’s squad called it the Larry 1,000, and they pulled it off in a conventional-wisdom-busting 18 months.

This move into the physical world was fresh ground for Google, whose taste for projects outside its core business had yielded Gmail, Google Maps, and Google Books—cool stuff, but still software. And the sight of Toyota Priuses chauffeuring themselves around the streets of Mountain View inspired possibilities, including more projects that didn’t consist solely of 0s and 1s.

But self-driving cars had fallen in Google’s lap. Finding other similarly hard, complex, worthwhile problems would require some infrastructure. Page made Thrun the company’s first “director of other,” in charge of doing all the stuff that didn’t line up with what investors expected from Google. Because Thrun was focused on the self-driving team (and after 2012, on his online education startup, Udacity), his codirector, Astro Teller, took the helm of a ship whose purpose and direction remained nebulous.

In an early conversation with Page, Teller tried to hash it out. “I was asking, ‘Are we an incubator?’” Teller says, sitting back in a chair with his trademark rollerskate-clad feet kicked out in front of him. Not exactly. They weren’t a research center, either. They were creating new businesses, but that didn’t convey the right scope.

Finally, Teller reached for an unexpected word. “Are we taking moonshots?” he asked Page. “That’s what you’re doing,” Page replied.

Creating a research division to build groundbreaking products is a mainstay of companies whose worth is tied to their ability to innovate. The tradition goes back at least to Bell Labs, founded in 1925 by AT&T and Western Electric. Made up of many of the smartest scientists in the country, Bell Labs is known for creating the transistor, the building block of modern electronics. It also helped develop the first lasers and, courtesy of mathematician Claude Shannon, launched the field of information theory, which created a mathematical framework for understanding how information is transmitted and processed. Along with eight Nobel prizes and three Turing Awards, the lab produced the Unix operating system and the programming languages C and C++.

This breadth was key to Bell Labs’ success. There was no way to know what the next breakthrough would look like, so there was no point in demanding a detailed plan of action. Its leaders were fine with “an indistinctness about goals,” Jon Gertner writes in The Idea Factory: Bell Labs and the Great Age of American Innovation. “The Bell Labs employees would be investigating anything remotely related to human communications, whether it be conducted through wires or radio or recorded sound or visual images.”

Yet Bell Labs functioned within some parameters. Its most valuable tool was basic research: Bell’s scientists spent years probing the fundamentals of chemistry, physics, metallurgy, magnetism, and more in their search for discoveries that could be monetized. And while “human communications” is a broad mandate, their work didn’t venture far outside what could conceivably improve AT&T’s business, which was telephones.

Silicon Valley got its first great innovation lab with Xerox’s Palo Alto Research Center, whose researchers stood out not for their scientific breakthroughs but their ability to take existing technology and adapt it for new aims that had never been considered. PARC created the laser printer and Ethernet in the 1970s and early ’80s and laid the foundation for modern computing by leading the transition from time-shared monsters that fed on punch cards to distributed, interactive machines—aka personal computers.

But in Silicon Valley, it’s best remembered for Xerox’s failure to capitalize on that work. The lab pioneered graphical user interfaces—think icons on a screen manipulated by a mouse—but it took Steve Jobs to bring them to the masses. Xerox’s bosses didn’t pooh-pooh the tech, they just didn’t see how it concerned them, says Henry Chesbrough, who studies corporate innovation at the Haas School of Business at UC Berkeley: “Xerox was looking for things that fit the copier and printer business model.”

By giving its denizens a near-limitless mandate and maybe not quite so limitless funding, X thinks it can create products and services that previous labs might never have discovered—or might have cast aside. It doesn’t do basic research, relying instead on other institutions (mostly governmental and academic) to create tools whose uses it can imagine. It doesn’t rely on having the smartest people in the world within its walls and is happy to scout for promising ideas and lure them inside. And, most important, it’s charged with expanding the scope of Alphabet’s business, not improving what’s already there. For all those Nobel prizes, Bell Labs was valuable to its owners because it made phone calls better and cheaper. Xerox’s shareholders appreciated PARC because it earned them billions of dollars with the laser printer.

X isn’t making these mistakes, because its job isn’t to make search better. It’s to ensure that the mother ship, Alphabet, never has to stop expanding.

In that way, X’s project hasn’t been to pioneer self-driving cars or launch internet-slinging balloons or envision autonomous drones; the real purpose has been to build a division capable of engineering such businesses. Its fetishization of failure and its love for ideas that make everyone look up, even if only to shoot them down, are all in service of this single goal: If you’re not failing constantly and even foolishly, you’re not pushing hard enough.

That’s great for Alphabet and for people who like the idea of self-driving cars (especially those who can’t drive) or tracking their health with non-invasive wearables or basking in the light of the internet in the dark corners of the world or getting their cheeseburgers and toothpaste without contributing to traffic and planet-choking emissions.

But Alphabet, through Google, already has tremendous influence over our lives: how we talk to each other, where we get our news, when we leave the house to beat traffic. For most people, it’s a worthy tradeoff for free email, detailed maps, and free access to nearly unlimited information. X seeks to multiply that influence by moving it beyond the virtual realm. Critics already call Google a monopoly. Now imagine its dominion extending into our cars, into the food we eat and the goods we order, into our physical well-being—into how we connect to the internet at all. Google today wields heavy influence over the parts of our lives embedded in our phones. Are we ready to let it in everywhere else?

Analysis: Many States Continue to Have Restrictive Telemedicine Policies – Healthcare Informatics

Healthcare organizations report high satisfaction with their telehealth virtual care platforms (VCPs); however, there are significant differences in how broad the various platforms are and in the quality of the vendors’ service. What’s more, integration with electronic health record (EHR) systems is a key challenge facing every telehealth vendor, according to a KLAS report.

In its report, “Telehealth Virtual Care Platforms 2019: Which Telehealth Vendors Have the Scalability Customers Need?,” KLAS evaluates some of the top telehealth companies, including American Well, MDLIVE and Epic, and analyzes what capabilities will set vendors apart as more healthcare organizations adopt virtual health technology solutions.

Most virtual care platform vendors receive positive performance ratings, but the depth and breadth of their capabilities vary, and this can impact scalability for organizations looking to grow, according to KLAS. No two vendors are alike in their capabilities, offering different combinations of functionality and experience.

Of the companies KLAS evaluated, the most common type of visit varied—most of American Well’s visits were on-demand urgent care, while the majority of Epic’s visits were associated with virtual clinic visits.

A key factor of scalability is the ability to support multiple visit types, KLAS researchers note. While multiple vendors offer support for all three visit types (on-demand or urgent care, virtual clinic visits and telespecialty consultations), no single vendor has a large proportion of customers using all three (only 12 respondents across all vendors said they were doing so).

American Well (a market share and mindshare leader) and MDLIVE, two of the vendors used most frequently for multiple visit types, receive generally positive, but lower-than-average, performance scores. Vendors more specialized in specific visit types or component layers (e.g., Vidyo and Zipnosis) have high scores but narrower expectations from customers.

No one vendor meets all needs equally well, but several are reaching for “all-purpose” status with internal development and/or recent acquisitions (American Well acquired Avizia; InTouch acquired TruClinic), according to the report.

KLAS’ analysis also uncovered a general trend of poor integration. In most cases, the addition of a virtual care platform also means the introduction of a second EHR into the clinician workflow.

“Although integration between EMRs is generally understood to be important for care quality, patient safety, efficiency, and productivity, few interviewed VCP customers have full bidirectional transfer in place. Most say that they are too early in their virtual care programs to pursue integration or that it simply costs too much,” KLAS researchers wrote.

Only American Well, Epic, and MDLIVE have more than half of interviewed customers currently on an integrated path, KLAS found. Epic has placed virtual care capabilities directly into its top-rated MyChart patient portal, which many patients already use. Epic integration means clinicians are able to stay within their existing workflow environment as well.

Many provider organizations are in the early phases of their virtual care programs where showing an ROI is an important milestone and one that organizations want to achieve as soon as possible, KLAS notes. “A key promise from vendors is that their technology and accumulated expertise will result in a fast start and continuous acceleration. When this comes at significant cost or progress is slower than expected, provider organizations can experience disappointment,” the KLAS researchers wrote.

When it comes to getting their money’s worth and achieving desired outcomes, Epic and InTouch are rated highest among fully rated vendors, and swyMed and Vidyo perform well among their smaller groups of respondents, KLAS researchers note.

“For each vendor, the current value proposition is somewhat narrow but well understood: Epic’s use is limited to existing patients of Epic EMR customers; InTouch is used primarily for consults; swyMed is used by respondents primarily for mobile, first responder needs; Vidyo delivers video-conferencing tools, which are typically combined with other VCP solutions. SnapMD is seen as a low-cost option, but some customers say the impact has been limited. Commentary from VSee customers suggests a similar experience,” KLAS researchers wrote in the report.

Many healthcare organizations are early on in their virtual care journeys, and their ability to achieve desired results depends on guidance from vendors. According to KLAS’ analysis, swyMed and InTouch receive the most praise for taking initiative in proactively guiding customers and also in quickly responding to support problems.

While respondents praise American Well’s platform scalability, some customers blame the vendor’s “exponential growth” for staffing shortages that have led to implementation holdups and backlogged service requests. Some SnapMD customers say hard-to-beat pricing comes with a support model that is spare in terms of providing tailored guidance, according to the KLAS report.

Most vendors offer two additional options that can help accelerate customers’ expansion and growth: supplemental services, including added-cost advisory and outsourced services, and tools that automate patient-facing tasks that traditionally require additional staff.

KLAS found that few customers mentioned these options in top-of-mind conversations. “Respondents who spoke of their vendor’s supplemental services most often referred to marketing support or strategic planning services from vendors American Well, MDLIVE, or Zipnosis. Those who referred to task automation report patient self-service capabilities around check-in, scheduling, surveys, and/or patient flow from InTouch Health (TruClinic), Epic, MDLIVE, or Zipnosis,” the KLAS researchers wrote.

EU’s Digital Beating Heart – Data Economy

In the digital world, borders are more blurred than in the physical one. Luxembourg’s digital economy is open and ready to help others, at both government and enterprise level, bulk up their digital strategies through a wide range of initiatives that have seen the country move up the hosting ranks in recent years.

In 2006 the Luxembourg Government identified the expansion of the country’s data centre fleet as an objective to develop ICT services in the Grand Duchy.


A year later, the cabinet set up the backbone for what is today an established data centre hub within the Luxembourgish borders.

It was then that LuxConnect, a Luxembourgish colocation player backed by the government, was launched.

Having moved from the steel industry to financial and satellite business roadmaps in the 1970s, Luxembourg is today one of the most attractive markets in Europe for digital.

According to consulting company BroadGroup’s “Datacentres Luxembourg” market report, Luxembourg accounts for over 600,000 sq ft of raised data centre floor space and up to 85.4 MW of data centre customer power (DCCP) available in total as of the end of 2016, with 19 third-party facilities, some offering wholesale.

LuxConnect is one of the major players in the market; however, other companies such as Data4, Datacenter Luxembourg, Sungard SA, Verizon Luxembourg, ebrc and Etix Everywhere also represent a large share of the market.

Some of the world’s leading tech brands, including PayPal, Amazon, Microsoft, eBay, and Skype, have established their European HQ in Luxembourg, and recent reports of Google planning the construction of a $1bn data centre in the country have taken interest in Luxembourg to a new level, making it one of the EU’s digital beating hearts.

In the next pages, Luxembourgish data leaders open up on the booming national hosting sector and what’s next for the country.

Antoine Boniface, CEO of Luxembourg-based Etix Everywhere, talks about how data centres have changed Luxembourg from financial hub to digital gateway.

Luxembourg has for years been seen as a financial destination, how are data centres challenging that view?

Luxembourg has for years been seen as a financial destination, but the Luxembourgish government also strongly encouraged the development of the data centre industry over the last 10 years.

As a result, the country is recognised for its data centre expertise and for its quality of infrastructure.

Luxembourg hosts 40% of Europe’s Tier IV data centres, according to Luxembourg for Finance.

Luxembourg is the second largest investment fund hub in the world.

Today, financial institutions are increasingly turning to IoT, cryptocurrency, blockchain and HPC, and will require more and more IT capacity.

They will need fast and secure access to their data, so they will tend to bring their data closer to them.

The demand for edge data centres is going to boom in Luxembourg.

Furthermore, due to Brexit, companies are looking for data centres within the European Union.

Luxembourg benefits from its central location in Europe, with direct access to 500 million potential consumers.

How has investment within the Luxembourgish data centre market changed in recent years and what is next for Luxembourg?

Luxembourg is becoming a strategic destination for the data centre industry, attracting more big players.

The country is recognised as one of the “Top 10” best locations in which to invest in data centres.

ICT has a growing role in the Luxembourgish economy, actively supported by the government.

Since 2014, Luxembourg has implemented the ‘Digital Lëtzebuerg’ initiative, which aims to strengthen the ICT industry, support the country’s digital transformation and make Luxembourg a substantial player in the data market.

Research in IT has been declared a national priority, and the authorities offer attractive conditions for R&D investment.

What is Etix preparing for the near future?

Etix Group has recently announced a major capital increase from SBI Crypto Investment, a wholly-owned subsidiary of SBI Holdings.

We will use the proceeds to accelerate the expansion of our global network of colocation data centres.

We already have infrastructures in France, Belgium and Morocco, and are currently building new data centres in Brazil, Ghana and Sweden.

We also have more than 10 projects ready to be built that will now progress more rapidly.

Our ambition is to support our customers in their global expansion by providing them with IT capacity wherever they need it.

Roger Lampach, CEO, LuxConnect

How are data centres helping to re-shape Luxembourg’s image outside Luxembourg?

Luxembourg is actively promoting itself as a technology hub in the centre of Europe, located as it is between all of the major capitals.

We have many conferences and exhibitions to assist in this regard (such as ICT Spring).

Our Government is very proactive in supporting the ICT sector, and Digital Luxembourg, the name behind Luxembourg’s digitalization movement, gets promising projects off the ground.

Our data centres are ‘state of the art’, setting a benchmark for how seriously we take the sector.

All our data centres are supplied with green, low-priced electricity.

We also invest significantly in ‘green’ technology, such as the unique biomass KioWatt plant cooling our data centre in Bissen.

What’s next for Luxembourg?

Luxembourg has many initiatives in the Fintech and Technology sectors in order to be considered one of the main players.

The Luxembourg government has always had the ability to consider radical sectors before they were mainstream; we were the first satellite broadcaster, and now we are looking at space mining.

Luxembourg, the Digital Nation, started building dark fibre infrastructure and data centres over 10 years ago to attract international companies requiring high specification server space and great connectivity.

What is LuxConnect preparing for the near future?

We will continue to promote our unique ‘multi-tier’ offering; very recently, DC1.3 was certified Tier III, in addition to the Uptime-certified Tier II and Tier IV in the same facility.

Insurance companies and others are setting up offices in Luxembourg (because of Brexit uncertainty) so we will assist them with server hosting via our Partners.

We are encouraging UK-based Managed Service Providers and colo operators to establish a foothold in the EU using our data centres, and US colo operators with no presence in Europe to do the same.

Frankfurt is on our doorstep (3.5 ms round trip), so we feel we can attract Disaster Recovery solutions from the operators there or their clients.

As our power costs are significantly lower than Germany’s, we are looking for ‘overspill’ from Frankfurt from the automotive sector, cloud providers and others there and in the rest of Germany.

The new GDPR regulations might help in that regard.

Projects such as the Open Compute Project and HPC are within LuxConnect’s scope for the near future.

Nicolas Mackel, CEO, Luxembourg for Finance

How are technology, and especially the government’s drive to install data centres, helping to contribute to the shift in perception that Luxembourg is much more than just a financial destination?

The shift to the digital era is a major priority for Luxembourg.

The country has positioned itself as a leading digital hub for a range of industries such as Cleantech, Logistics, Automotive, ICT, Space and Finance, with world-class infrastructure including data centres; the country offers the largest density of Tier IV data centres in Europe.

As an example, the government of Estonia chose Luxembourg to open its e-embassy, the first of its kind worldwide, hosting Estonia’s most critical and confidential data.

Other international institutions such as NATO, the European Patent Office, as well as numerous international financial institutions trust Luxembourg to store their critical data.

Being at the centre of the “Golden Internet Ring”, Luxembourg is connected with major European cities and financial centres through 28 fibre routes with ultra-low latency and average round-trip times of c. 5 milliseconds.
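The round-trip figures quoted here and elsewhere in the piece follow directly from the physics of fibre: light in glass travels at roughly two-thirds of its vacuum speed, about 200,000 km/s. A minimal back-of-the-envelope sketch (the route lengths below are illustrative assumptions, not figures from the article):

```python
def fibre_rtt_ms(route_km: float) -> float:
    """Theoretical minimum round-trip time over a fibre route, in milliseconds."""
    # Light in optical fibre covers roughly 200 km per millisecond
    # (about two-thirds of its speed in a vacuum).
    SPEED_IN_FIBRE_KM_PER_MS = 200.0
    # Round trip = out and back, so double the one-way route length.
    return 2 * route_km / SPEED_IN_FIBRE_KM_PER_MS

# An assumed ~350 km fibre route (on the order of Luxembourg-Frankfurt)
# yields ~3.5 ms, and ~500 km yields ~5 ms -- consistent in magnitude
# with the round-trip times cited in the article.
print(fibre_rtt_ms(350))  # 3.5
print(fibre_rtt_ms(500))  # 5.0
```

Real routes add switching and equipment overhead on top of this floor, which is why physical proximity to hubs like Frankfurt matters so much for disaster recovery and trading workloads.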

The country ranks 1st worldwide for technological readiness (Global Competitiveness Report, World Economic Forum, 2015); 1st globally for international bandwidth and laws relating to ICTs, and 2nd for intellectual property protection (Networked Readiness Index, World Economic Forum, 2016); and 5th in Europe in the Digital Economy and Society Index 2018, as one of the leading countries for connectivity, digital skills and Internet usage.

These are key advantages for the positioning of the financial centre in the digital era.

Beyond infrastructure, Luxembourg focuses on R&D capabilities with renowned, industry-focused research institutions such as the Luxembourg Institute of Science and Technology (LIST) and the University of Luxembourg’s Interdisciplinary Centre for Security, Reliability and Trust (SnT).

The latter leads research projects with international financial players such as PayPal, Clearstream, BGL BNP Paribas and the Luxembourg Stock Exchange.

This underlines the link between public research, ICT infrastructure and the financial centre in Luxembourg.

What in your opinion will be the main drivers for Luxembourg in the coming years?

Luxembourg is aiming to establish itself as a leader in the digitalisation of key areas.

Fintech is one of them.

Luxembourg offers ideal conditions for FinTech companies to develop their services and products and expand their business to reach a European customer base.

On top of this, Luxembourg’s international financial centre provides a significant local market for FinTechs to launch new products in a secure environment.

Luxembourg’s open and responsive approach to regulating FinTechs under European passport provisions means that these innovative new companies can conduct business on an EU-wide level from a single base.

At present, leading companies such as Rakuten, eBay, Amazon, Daimler and PayPal are serving the European Union from Luxembourg.

Several projects have been initiated to drive further digitalisation, such as FundsDLT and Fundchain, two distributed ledger solutions for asset management, an industry in which Luxembourg ranks second globally.

The European High-Performance Computing (HPC) project is a more infrastructure-driven effort, with Luxembourg acting as its headquarters.

As part of the European Commission’s Horizon 2020 program, the HPC project will be a strategic resource for Europe’s future as it allows researchers to study and understand complex phenomena while allowing policy makers to make better decisions and enabling industry to innovate in products and services, in areas such as fintech.

Canada Invests in Clean Technologies to Reduce Methane Emissions – SYS-CON Media (press release)

CALGARY, July 10, 2018 /CNW/ – As Canada transitions to a low-carbon future, investments in clean technology will support jobs for the middle class and improve clean air by reducing pollution.

Kim Rudd, Parliamentary Secretary to the Honourable Jim Carr, Canada’s Minister of Natural Resources, announced today an investment of $2.2 million for two projects aimed at tackling methane emissions in the oil and gas sector to help Canada achieve its climate change goals.

With an investment of $1.6 million, the Government of Canada is collaborating with Clearstone Engineering Ltd. to support research into technologies and practices that detect, quantify and reduce volatile organic compounds (VOC) and methane emissions. Canada is also contributing $600,000 to Patro Research Ltd. for research into finding the most cost-effective, high-impact methane emissions-reduction opportunities.

Both projects are being funded through Natural Resources Canada’s Energy Innovation Program, which supports initiatives that accelerate clean technology development. With methane emissions posing a real environmental challenge, these investments will help Canada achieve its climate change goals and improve the environmental performance of the oil and gas sector.

Through Canada’s national energy dialogue, Generation Energy, Canadians expressed that Canada has an opportunity to be a leader in the transition to a clean growth economy. We will continue to support clean tech initiatives that create jobs for the middle class, support Canadian industry competitiveness, clean our air and act on climate change.


“Investing in new, innovative technologies and practices to help tackle methane emissions demonstrates Canada’s commitment to building a clean growth economy that integrates clean technologies in the oil and gas sector, creates good, middle-class jobs for Canadians and will help us achieve our climate change goals.”

Kim Rudd

Parliamentary Secretary of Natural Resources Canada

“Clearstone is honoured to be working on this important study aimed at exposing China to Canadian technologies for managing VOC and methane emissions in the oil and gas sector. Clearstone has been working in China’s oil and gas industry for over a decade and has had the opportunity to engage with a broad range of relevant Chinese government and industry stakeholders. We see significant opportunities in China and believe that this project is a great opportunity to showcase Canadian innovation and expertise. The outcomes of this project will be beneficial to both countries.”

David Picard, President

Clearstone Engineering Ltd.

“Through this initiative, Patro Research is developing methane mitigation options for the integrated oil and gas pathways through which natural gas is produced and delivered to end users. Our research and development activity is aimed at improving access to cleaner energy products and technologies not only in Canada but also in developing countries.”

Craig Fairbridge, Managing Director

Patro Research Ltd.


SOURCE Natural Resources Canada