The quantum computing race the US can’t afford to lose – TNW

Quantum computing has ushered in a new era of information technology. An international arms race to develop quantum computers has steadily grown more competitive and more critical.

China took the early pole position by unveiling the world’s first quantum communication line, linking Beijing and Shanghai as no two cities have been linked before. The first quantum-encrypted video call was made that same day by Chinese researchers, made possible by the world’s first quantum satellite, known as Micius.

It’s clear that quantum technology promises to reshape computing, and other countries are already staking their claims, vying to be the nation that ultimately emerges as the world leader.

Cybersecurity in the crosshairs

Beyond its promise as a booster for communications, quantum computing also poses a very real threat to data protection, with its anticipated ability to quickly crack most of today’s codes.

Only the lack of large-scale quantum computers is holding back the ability to shred today’s encryption. And both criminals and nation-states are capturing as much encrypted data as they can now, with the expectation that quantum computers will eventually be able to crack current protections.

China and other nations are investing heavily in research and development for quantum computers as well as technology that could, theoretically, prevent hacking by quantum supercomputers. If the United States fails to develop a similarly strong quantum infrastructure, all of today’s protected data could be at risk.

This includes military data that would directly impact operational security (OPSEC), the protection of the critical information that underpins any military mission.

While OPSEC is one major potential vulnerability, other systems could be targeted, with the financial and medical sectors coming first to mind. Both industries play pivotal roles in American life and hold vast stores of sensitive data.

A sufficiently advanced quantum computer could theoretically decrypt and break into a mass of bank accounts or patient records in very little time.
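
Most of today’s protections are public-key systems whose security rests on math problems, such as factoring large integers, that Shor’s algorithm running on a large quantum computer is expected to solve quickly. The Python sketch below is a toy illustration of that weakness, using deliberately tiny numbers:

```python
# Toy illustration only (not real cryptography): RSA security rests on
# how hard it is to factor the public modulus n. Shor's algorithm on a
# large quantum computer is expected to make that factoring step fast.

p, q = 61, 53                    # secret primes; real keys use 2048+ bit moduli
n = p * q                        # public modulus (3233)
phi = (p - 1) * (q - 1)          # Euler's totient of n
e = 17                           # public exponent
d = pow(e, -1, phi)              # private exponent (needs Python 3.8+)

message = 42
ciphertext = pow(message, e, n)  # anyone can encrypt with the public key (n, e)

# An attacker who can factor n rebuilds the private key outright. Trial
# division works here only because n is tiny; a quantum computer is what
# would make this step tractable at real key sizes.
f = next(x for x in range(2, n) if n % x == 0)
d_cracked = pow(e, -1, (f - 1) * (n // f - 1))

assert d_cracked == d
assert pow(ciphertext, d_cracked, n) == message  # plaintext recovered
```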

Quantum technology requires investment

Spending on technology across the board is projected to grow over the next few years as computing advances. The United States Department of Defense has requested $899 million for computer science research. While this research focuses largely on quantum computing, the requested amount, set against an economy of roughly $20 trillion, amounts to less than 0.005 percent of gross domestic product (GDP).

Meanwhile, China is investing far more heavily in quantum computing. While its exact government spending is unknown, a new research laboratory costing approximately $10 billion was recently built in China for the express purpose of researching quantum technology.

The total amount being spent by the Chinese government dwarfs the investment by the United States, and that gap does not appear likely to close over the next five to ten years.

To keep its infrastructure secure, the United States must prioritize the digital space. The digital theater is likely the next major area of operations as nations vie for sensitive information.

A situation like this was depicted in Tom Clancy’s excellently researched Threat Vector. In the book, the Chinese use superior technology to disrupt American businesses and pilfer sensitive documents, not unlike what could very well be happening right now in anticipation of quantum computing advances.

The dangers of second place

While Threat Vector is fiction, there are harsh realities facing the United States should it fail to remain competitive in this critical area. Beyond the obvious risks of stolen sensitive data and compromised mission-critical secrecy lies the loss of jobs, current and potential, as quantum computing is developed and designed offshore.

For the United States to remain at the cutting edge, it will need to create its own quantum network to allow for unbreakable lines of secure communication, as China is already doing.

We are also in vital need of quantum-resistant safeguards, such as quantum key distribution (QKD), that can be deployed as soon as possible. Our most critical data needs to be safe from future quantum computers and their expected ability to crack today’s encryption with ease.
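
For a sense of how QKD works, its best-known protocol, BB84, has sender and receiver compare randomly chosen measurement bases and keep only the positions where those bases match. The Python sketch below is a deliberately idealized toy of that sifting step:

```python
# Idealized sketch of the sifting step in BB84, the best-known QKD
# protocol (no eavesdropper, noise, or error correction modeled).
import secrets

N = 32  # qubits sent in this toy run

alice_bits  = [secrets.randbelow(2) for _ in range(N)]  # key material
alice_bases = [secrets.randbelow(2) for _ in range(N)]  # 0 rectilinear, 1 diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(N)]  # receiver's random choices

# With ideal optics and no eavesdropper, Bob reads Alice's bit exactly
# whenever their bases match; mismatched positions are discarded.
sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
              if a == b]

print(f"sent {N} qubits, kept {len(sifted_key)} sifted key bits")
```

In a real deployment, the two parties would also compare a sample of the sifted bits over a public channel; an eavesdropper’s measurements would disturb the qubits and show up as an elevated error rate.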

American companies like Microsoft, Intel, Google, IBM, and others are conducting research and development into quantum technologies, but they will likely require assistance from the government. After all, government backing has been at the root of most technical marvels of everyday life, such as microchips, GPS, touch screens, Google’s search engine and the Internet.

The biggest competitor in quantum computing is China, which is likely the world’s frontrunner, but there are others. Russia is also pushing the boundaries: spearheaded by the Russian Quantum Center, it has announced a breakthrough, a quantum computer designed to reliably perform basic computations faster than anything else available today.

Even North Korea has stated that it intends to develop quantum computers. While it’s unknown how much North Korea has invested in such a program, the fact that it is tossing its hat in the ring is troubling.

The United States can’t afford to come in second in the global quantum arms race, especially to any country that has been adversarial or downright antagonistic in the past.

In a quantum world, the speeds are so fast and the numbers so large that second place really doesn’t mean very much. There is the leader in quantum computing, and then there is everyone else.

The United States has an incredible ability to compete on the world stage in anything. The effort just needs the proper investment, manpower and direction. Quantum computing is a race where we can compete, and one that we absolutely must win.

State briefs for Sept. 1 – NewsOK.com

Conference speakers will include Gene Nora Jessen, an original member of the Mercury 13 testing program for female astronauts; Michelle Millben, a former White House adviser; and Carolyn Rodz, founder of Alice, an AI-based business accelerator in partnership with Dell. Panel discussions will focus on the impact of women on scientific fields, policy and society, and on startups around the state, and will feature prominent tech founders, biotechnology experts, social entrepreneurs, meteorologists and political activists.

A pre-conference reception will be from 6 to 8 p.m. Sept. 13 at Science Museum Oklahoma. Registration is $30, or free for teachers, students or university faculty with valid identification. Go to www.okcatalyst.com/okwise for more information.

Privacy in the Balance – Government Technology

Seattle has a homeless problem and it’s getting worse. Since 2007, the city’s homeless population has risen 47 percent, according to the Seattle Times. Today, the city has more than 10,000 residents who don’t have permanent shelter, putting Seattle and King County near the top of the list for urban concentrations of homelessness. Not surprisingly, the situation has put pressure on the city to deliver help in the form of food and shelter, along with addiction and mental health services, while keeping an eye on crime and health problems at the many encampments that have taken root in some neighborhoods.

But to do that calls for using lots of data, some of which may be personal. The city wants to help its homeless population in a coordinated and effective way, which may also mean sharing data between agencies. How that can be done without impacting the privacy of individuals is a balancing act, one that Chief Privacy Officer Ginger Armbruster finds herself doing on a daily basis. “We need data to make sure we are meeting our goals, because we don’t have a lot of time. These people are in a crisis,” she said, regarding the urgency of the problem. Yet it takes time to ensure privacy.

Seattle has a history of putting privacy at the forefront of its policies, which can add complexity to a discussion on how best to deliver services to those who need them the most. “Privacy has strong support in Seattle,” said Armbruster. “It’s about collecting only the data we need, managing it, getting consent and giving users some control over its accuracy.” How Seattle balances its data needs and the growing clout of technology with privacy concerns is an issue for cities nationwide. The solutions aren’t simple, but some best practices are beginning to emerge.

A coordinated approach to homeless service delivery requires sharing data between agencies without jeopardizing individual privacy. (Photo: Shutterstock.com)


More Technology, Less Privacy?

Homelessness isn’t the only issue Seattle is trying to tackle with data. The city wants to better serve its immigrant population. Then there’s the growth in smart city services, particularly around transportation. For urban areas in other parts of the country where crime is a problem, data in the form of surveillance cameras and videos is in demand from law enforcement agencies. Altogether, cities spent nearly $31 billion on IT in 2017, much of it going toward smart city efforts, the Internet of Things, open data and civic engagement, according to the Center for Digital Government.*

To manage all this data, cities increasingly rely on vendors who can host the services and store the data rather than build expensive data centers themselves. The trend has given cities opportunities to govern in new and better ways, as well as to roll out services that weren’t possible just a few years ago. Cities of all sizes can help drivers respond more quickly to traffic congestion problems, predict where the next crime hot spot will occur, track pollution problems and give citizens the kind of engagement that builds trust.

But some of the technologies that make all this possible collect data that worries privacy groups. The American Civil Liberties Union has been particularly vocal about the inherent privacy risks that today’s high-tech tools can trigger. That doesn’t surprise Peter Swire, who is a leading privacy and cyberlaw scholar, and currently a law professor at the Georgia Institute of Technology. “For smart cities, a huge range of applications involve personal data,” he said.

Cities are increasing their dependence on online services and cloud storage, which is cause for concern, according to Swire, who was the country’s first chief counselor for privacy in the U.S. Office of Management and Budget during President Bill Clinton’s administration. At the same time, cities are ramping up the number of applications that involve personal data. “You’ve got license plate information, body-worn cameras, facial recognition technology,” he said. “Cities are also proposing to build applications that use sensors, which can collect identifiable information.”

In another trend that worries privacy advocates, cities are allowing more third-party firms to provide services, such as e-scooters, bikes and public Wi-Fi. Some of these services are advertised as free, but they often require a person to download an app that can identify the person’s location or capture other forms of personal information in return for use of the service.

While it may sound like a service, the company’s business model could have more to do with collecting information about people than with the service itself, according to Armbruster. “Cities need to make principled decisions about the kind of data the company collects, how it is handled,” she said. “It’s our responsibility to our citizens to know what data these firms are collecting and we have to make a smart decision on whether to allow others to collect it.”

Two Game-Changing Privacy Laws

While privacy concerns in local government have been growing, two recent events have thrust the issue to the forefront. In May, the European Union began enforcing the privacy rule known as the General Data Protection Regulation or GDPR, which gives EU citizens control over their personally identifiable information.

Few local governments expect a significant, direct impact from GDPR, but the regulation has raised public (and internal government) awareness about personal privacy. GDPR does, however, affect private online service firms, which have had to apply much stricter privacy guidelines to their operations than in the past. “It means that companies are learning to do privacy impact assessments and provide other protections required by GDPR,” said Swire. That’s going to raise expectations among citizens to receive the same level of privacy protections from local governments as they now receive from private online services.

In June, California passed a major privacy bill that allows consumers to ask companies what information is being collected about them, why it was collected and which third parties have received it; consumers can also demand that the information be deleted and not sold. Companies that collect the information can charge a fee to users who opt out of sharing their data, to recoup any lost revenue, as long as the fee is reasonably related to the value provided by the consumer’s data. With California taking the lead on strengthening privacy protection, other states are likely to follow, experts say.

Santa Clara County Chief Privacy Officer Mike Shapiro hopes to capitalize on his Silicon Valley location and start a center of excellence focused on privacy.


Enter the Local Chief Privacy Officer

Regardless of what happens at the state level, cities and urban counties are beginning to take steps to protect privacy at the local level. Seattle was the first city to hire a chief privacy officer (Armbruster is the second person to hold that position). In April, New York City Mayor Bill de Blasio appointed Laura Negrón as the city’s first chief privacy officer. She has been tasked with working across city agencies to promote new citywide protocols around the collection, disclosure and retention of personally identifiable information, as well as to centralize how policies and procedures regarding privacy are to be handled.

Few other local jurisdictions have hired CPOs so far, but in 2017 Santa Clara County, Calif., appointed Mike Shapiro as its first privacy officer, and one of the first to work for a county. Shapiro has an extensive background working on privacy issues in the private sector and consulting with federal and state agencies. The big issue facing local government, according to Shapiro, is the development of privacy policies that are consistent across a government at a time of rapid growth in data-driven projects.

“The challenge is how to take the large amounts of information we collect for constituents and serve them better while also protecting privacy rights and following the law,” he said.

Given Santa Clara County’s location in the heart of Silicon Valley, Shapiro believes the county can play a lead role in fashioning privacy policies and best practices that draw on the strengths of local high-tech firms, academia and government. He hopes to start a privacy center of excellence that will foster the kind of dialog that can balance privacy with digital commerce and good governance.

But Shapiro’s more immediate mission is to create privacy best practices within county government that balance the need to share information with the need to protect it. The county is in the early stages of developing big data sharing projects, so now is the time to build privacy into project management and work processes, not afterward.

To get the ball rolling, he has launched an awareness campaign to educate staff on the different kinds of privacy risks and then promote best practices. Part of the effort is understanding how departments perceive privacy, as well as learning what they do with the data they collect, how the data is shared and, when it isn’t, why not. Sometimes an agency’s desire to protect privacy can thwart projects that could serve people, Shapiro explained. Having the right conversation with the right people can overcome roadblocks and enable data sharing that doesn’t compromise privacy rights.

In addition to training to raise awareness, governments like Santa Clara County and Seattle are following the lead of private companies and have begun to conduct privacy impact assessments on new projects. Impact assessments are required for federal IT systems, according to Swire. “The key is to have someone with privacy expertise examine important systems before they are deployed,” he said. “That would be a good practice for local governments.”

Swire also advises local governments to have standard contract clauses for IT procurements that provide privacy requirements. He cites California’s new law as a reason why local governments need to be more careful when it comes time for IT acquisitions, especially those that involve vendor access to data. “Cities should think carefully about it before they agree to let vendors sell citizen data,” he said.

Up the Pacific Coast in Seattle, Armbruster’s role as the city’s CPO has taken on greater significance. The fact that the city council passed a resolution that “privacy is a human right” is an indication of just how important privacy has become. She runs an office of four, which operates out of the city’s Department of Information Technology, and functions citywide, overseeing and managing privacy policies and procedures.

“From the beginning, it’s about education and bringing people along the journey to understand privacy,” said Armbruster. Her office has set up a network of privacy champions in every one of the city’s 33 departments. The champions attend regular meetings on privacy, act as a resource on the topic, and some are going through a certification program run by the International Association of Privacy Professionals. Finally, all city staff must participate in standard security and privacy training on an annual basis.

While some workers might grumble about the training process, Armbruster says it’s crucial to making privacy part of how workers think about information on a daily basis. “You have to build the awareness of privacy or it doesn’t make sense,” she said. “We do that by making the need for privacy relevant to individuals, so they are aware of the impact when privacy gets lost.”

Armbruster and other privacy experts emphasize the importance of making privacy an integral part of the process when it comes to program development and IT deployment. Having a review system that tries to catch privacy issues at the end of the process is a recipe for disaster. Instead, Seattle, Santa Clara County and a few other jurisdictions are learning how to build in privacy by design. “This is a very well-known concept, in which you build privacy into the organic process of building systems,” said Armbruster.

When it comes to technology itself, Armbruster keeps an eye on cloud services, although she feels that cloud providers are getting better at providing a service that builds in sound data protections. She also worries about shadow IT — those so-called “free” apps and storage services, such as Dropbox, which employees will turn to because they are familiar with them outside of work. “People have to understand that free is not free,” she said. “In our position, it’s not ‘your’ data that is sitting in some third-party cloud storage service, it’s citizens’ data or the city’s data.”

Finding That Balance

As more local governments develop and launch smart city projects, it’s becoming increasingly clear that conversations and strategies around privacy need to start happening sooner rather than later. While today’s game-changing projects often involve sensors that collect data that may not identify individuals, too often cities are offered an on-ramp to smart city innovation from a third party that has data collection about individuals at the heart of its business plan.

Knowing an individual’s location has proven to be a gold mine for companies that market products and services. Marketing firms are expected to spend $20.7 billion on geo-targeted mobile ads this year, rising to $32.4 billion by 2021, according to BIA Advisory Services.

In Seattle, Armbruster says companies approach the city regularly about a new service they would like to offer for free, but when questions are asked, it is soon clear that what they want is information about people “to feed that big marketing cloud in the sky,” she said. “Lots of ‘free’ apps aren’t free because they are collecting data about the individuals who use them.”

Local governments need to have serious conversations with vendors when it comes to smart city projects. Rather than say no and kill the project over privacy concerns, Armbruster advises city officials to talk with the department that might want to roll out the service in conjunction with a vendor and see whether the data it collects could be useful at the block level or census level, rather than at the individual level.

What it comes down to, according to Swire, who has studied the impact of technology on privacy for decades, is “that every smart city project needs a smart privacy plan as well.”

Correction: An earlier version of this article incorrectly identified the university where Peter Swire works. It is the Georgia Institute of Technology, not Georgia Tech University.

*The Center for Digital Government is part of e.Republic, Government Technology‘s parent company.

Tod Newcombe, Senior Editor


With more than 20 years of experience covering state and local government, Tod previously was the editor of Public CIO, e.Republic’s award-winning publication for information technology executives in the public sector. He is now a senior editor for Government Technology.

South Texas College Celebrates Its First 25 Years – Valley Town Crier

A look back at why the college was created & how it continues to deliver on its promise

As a young teen, current STC Trustee Rose Benavidez recalls traveling with her father, the late Manuel Benavidez, across Starr County looking for support for what was then known as South Texas Community College (STCC) from the community.

Her father had just been appointed as a trustee for the college, and in the summer of 1995, public acceptance of STCC had snowballed, culminating in an election that saw voters approve three propositions that were essential for the college.

“When I was a kid driving around with my father I remember they were having classes in laundromats or they were allowed by school districts to have classes in old libraries,” Benavidez said. “I can recall my very first experience with the college was when I was maybe 15 years old. My father had just been appointed as a trustee at the college and they were going out for their first bond.

“Funny enough, the first bond that passed was when my father was the chairperson of the Board of Trustees, and this last bond (2013) occurred when I was the chairperson of the Board,” Benavidez said. “It was pretty amazing to see people come out and support that initiative, but far beyond the brick and mortar is the impact and the change in the people of our community and the whole Valley for that matter.”

The creation of STC sprang from the compelling need to improve access to higher education in Hidalgo and Starr Counties.

In the spring of 1993, as legislation was being drafted to create what was then known as South Texas Community College (STCC), the college began distributing flyers for the new institution urging students to “test their wings” and apply as early as possible.

In June 1993, then Texas Gov. Ann Richards signed legislation creating South Texas Community College. At the time, STCC was the 50th community college in Texas and the first in the Upper Valley. The college was created by the Texas Legislature, converting the former Texas State Technical College campus in McAllen into a locally-governed community college serving Hidalgo and Starr Counties.

“I had the passion, the commitment, and the tenacity to say ‘by God, we’re going to do this,’” said STC President Dr. Shirley A. Reed. “That passion and commitment haven’t changed. When you look at the level of poverty in the Valley, and the number of individuals who haven’t even had an opportunity to finish high school, much less go to college, and then you think about the quality of their lives, it becomes clear there is no end to the work that needs to be done in the Valley.”

South Texas College was created on Sept. 1, 1993, by Texas Senate Bill 251 to serve Hidalgo and Starr Counties. When Gov. Richards signed the legislation, STC became the only community college in Texas to have been established directly by the Texas Legislature, a response to the compelling need to improve access to higher education in Hidalgo and Starr.

STCC opened its doors that September and classes began with 1,058 students. The McAllen Memorial High School band provided music for the opening of the college.

Since that time, STC has seen steady growth in enrollment, from 1,058 students in 1993 to more than 34,000 students by fall 2017.

“We have seen a tremendous growth,” said Trustee Dr. Alejo Salinas, who joined the college’s Board in 1996. “We have grown so much. It is incredible. I can remember when I started with the college we maybe had 1,000 students; now we are over 30,000. That speaks for the college itself.

“To see friends, family and ex-students come through our programs and to see them graduate has been a very satisfying experience,” Dr. Salinas said. “To hear the feedback from those who have come here, and how full of pride they are with the education they have received, that’s a very rewarding experience for me. It provides me with plenty of reason for wanting to be a part of this college.”

According to the Texas Workforce Commission, since the college’s creation in 1993, unemployment has fallen from 24.1 percent to 11.3 percent in Hidalgo County and from 40.3 percent to 15.6 percent in Starr County.

“From what we have seen, and just the impact on the education on the sheer number of students who are here, it translates not only to the numbers we have but also the number of families we have impacted because of students receiving their certificate or degree,” said former Trustee Graciela Farias. “Having the whole region benefit in such a positive manner because of what South Texas College has been able to do will continue for all of our students. The sky is the limit for South Texas College.”

STC offers more than 120 degree and certificate program options including associate degrees in art, science, technology and allied health fields. The college also offers 18 online associate degrees and certificate options through South Texas College Online enabling students to earn their degrees without even setting foot on campus.

The college ranked second in the nation for total enrollment of Hispanics among two- and four-year schools in 2015, according to Hispanic Outlook (HO) on Education Magazine. The magazine published its “Top 100 Colleges and Universities for Hispanics” with data collected from the 2015 school year. In that issue, numerous programs at STC placed among the top 10 nationally for degrees awarded to Hispanics.

STC is also one of only three community colleges in Texas accredited to offer applied baccalaureate degrees, with Bachelor of Applied Technology (BAT) degrees in Technology Management, Computer and Information Technologies, and Medical and Health Services Management, as well as a degree in Organizational Leadership.

“It has been tremendously gratifying to see the outstanding workforce development programs support local industry, and our local workforce in being better trained in areas like manufacturing and technologically advanced fields,” said Trustee Paul Rodriguez. “Employees are reaching higher pay scales and employers are providing better jobs. I believe the future will see more college degrees, more critical partnerships with other colleges and universities and a center for manufacturing and industrial development for the entire border region.”

Delta State Professor Talbot Brooks is a Katrina hero – Jackson Clarion Ledger

Good Mornin’! Good Mornin’!

On August 29, 2005, Hurricane Katrina – one of the largest natural disasters ever to hit the United States – made landfall on the Mississippi Gulf Coast, shattering a state, scattering its residents and flooding everything in sight. Katrina ravaged everything in its path from Florida to Texas, and the Magnolia State got ignored by the national media and The Weather Channel. But the storm still hit, still took lives and changed many things forever. Recovery and rescue efforts were hindered as landmarks, roadway signs and the things that made anything easy to find and familiar were all gone. Just how do you give somebody directions to a place that used to exist but doesn’t anymore, when the folks there still need help anyway?

Hurricane Katrina eye wall from Hurricane Hunters aircraft.
National Weather Service

That’s the question that echoed, as did many others, in the early efforts. Then along came Talbot Brooks – a lifelong volunteer fireman. He had only moved to Mississippi in January of 2005 after working at Arizona State University. At Delta State University in Cleveland, Brooks works as a university professor teaching things that, frankly, I still don’t understand after our one-hour visit earlier this summer. His bio on the DSU website notes that he “has served as the Director of the Center for Interdisciplinary Geospatial Information Technologies at Delta State University since 2005.”

It goes on to say, “Mr. Brooks has been a career firefighter, volunteer firefighter, medic, and a U.S. Army Reserve Medical Service Corps Captain. Mr. Brooks has volunteered his services, time, and effort for significant events and crises such as the April 2011 tornado outbreak in Mississippi, the 2011 Mississippi flood, the 2011 Tohoku Tsunami, the 2010 Haiti earthquake, and Hurricanes Katrina and Rita in 2005, among many others.”

Working with Bolivar County, Brooks took the Katrina storm-track information and “threw it on a map showing what the wind speeds would be for Bolivar County. They sent it down to MEMA and they called and asked if I could come down. We loaded all of our computer gear and went down to MEMA and said ‘hi, we’re here to help and by the way did you know we can do this geospatial thing?’”

Brooks arrived in Jackson on August 27 to work with disaster management agencies and immediately dug in with all the technology available to him. With so many first responders coming into an unfamiliar area stripped of road signs and landmarks, Brooks digitally recreated the coastal area and helped send rescuers to the exact locations they needed to reach. He also developed storm surge depth models to predict where operational managers could most effectively put their resources.

He needed more help and put out a call to the GISCorps Committee to find it. Volunteers came from all over the US – Ohio, Missouri, Florida, North Carolina, Illinois, California, New York, Texas, Arkansas and Colorado – with an average of eight years of work experience, and they had an immediate impact on the situation. Working with local universities, the team was involved in search and rescue efforts. More than 100 addresses and locations were translated into GPS coordinates for US Coast Guard rescue helicopter evacuation missions, and many of these location-to-GPS translations could only be done fast enough using GIS. Many of the calls had scant information to work with, such as “I’m trapped at the water treatment plant in _____” or “I’m about 1 mile north of _____ and I can see a church steeple”. Talbot and his team helped rescuers find them all. More than 5,000 linear feet of maps were created to help find critical infrastructure hidden by storm debris, and the team generated debriefing maps for responders and then-President Bush.
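
At its core, that address-to-GPS translation is gazetteer matching: comparing a caller’s fragmentary description against a list of known landmark coordinates. The Python sketch below is schematic only, with hypothetical names and coordinates:

```python
# Schematic sketch (not the team's actual tooling) of translating a
# fragmentary rescue call into a GPS fix by matching it against a
# gazetteer of landmarks. All names and coordinates are hypothetical.

GAZETTEER = {
    "water treatment plant": (30.2866, -89.3862),
    "church steeple":        (30.2905, -89.3778),
}

def locate(report):
    """Return best-guess coordinates for a caller's description, or None."""
    text = report.lower()
    for landmark, coords in GAZETTEER.items():
        if landmark in text:
            return coords
    return None  # no match: escalate to an analyst for manual map work

print(locate("I'm trapped at the water treatment plant"))  # (30.2866, -89.3862)
```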

“Geospatial technology is really a suite of technologies that uses GPS, aerial photography, satellite imaging, geographic information systems and visualization techniques to explore and understand problems from a spatial context,” Brooks said. “Everything is somewhere. Being able to put that where component into a problem set helps us better understand what’s going on and solve some pretty unique problems.”

He explained that when you call 911, your phone should appear on a map on a screen in front of a dispatcher, who then sends the right department for the area you are in. But Brooks also understands how to take, say, every fire call that has come into 911 in a community over the last five years and look at it spatially.

“When I start to look at this information on a map, I can start to derive some trends,” Brooks said. “Oh, I have a hot spot or lots of kitchen fires in this neighborhood. Now I can exploit that information. If I’m the fire chief and I see this particular neighborhood with a problem with kitchen fires, I can do a targeted fire prevention program and go door to door with cooking safety. By using that information on a spatial basis, I get a better understanding of what my problems are.”
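
The kind of hot-spot analysis Brooks describes can be sketched in a few lines: snap historical call locations to grid cells and count them. The Python below is an illustration only, using hypothetical coordinates and a crude square grid:

```python
# Illustration only, with hypothetical call coordinates: bin fire-call
# locations into a square grid and surface the busiest cell.
from collections import Counter

CELL = 0.01  # grid cell size in degrees, roughly a neighborhood

fire_calls = [  # hypothetical (latitude, longitude) pairs from 911 records
    (33.744, -90.725), (33.745, -90.724), (33.746, -90.716),
    (33.744, -90.725), (33.791, -90.712), (33.745, -90.725),
]

def cell_of(lat, lon):
    """Snap a coordinate to the corner of its grid cell."""
    return (round(lat / CELL) * CELL, round(lon / CELL) * CELL)

counts = Counter(cell_of(lat, lon) for lat, lon in fire_calls)
hot_cell, n = counts.most_common(1)[0]
print(f"hot spot near {hot_cell}: {n} kitchen-fire calls")
```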

At Delta State, Brooks and his team have created the “first true undergraduate degree built specifically to teach geospatial analysis and intelligence techniques. In most academic programs, students will major in environmental science or geography and then take four or five classes in geospatial technologies and call it a minor or a certificate. We’ve flipped that model.”

The new discipline is a full undergraduate program at Delta State University. It is the only one of its kind in the United States. The only one. Right here in the Mississippi Delta. Other schools have pieces of what he does but no one has the whole package to combine it all.

“This field wasn’t even around 30 years ago,” Brooks said. “We’re thrifty here with education. Our incoming Marines will be set up to earn a two-year degree at MDCC and then a four-year degree here at Delta State. We had our first student get his two-year certificate this summer. Delta State doesn’t offer everything online every semester.”

DSU leads the country. The National Geospatial-Intelligence Agency and USGS recognized Delta State as a National Center for Academic Excellence; other schools so recognized include West Point and George Mason.

“We keep some pretty interesting company,” he said.

The discipline is one fully used by the military and intelligence agencies, and Brooks has plenty of military students, some of whom are actually sending in homework from foxholes scattered around the globe. To help with the cost of education, DSU partners with Mississippi Delta Community College in Moorhead so military students can take classes at lower cost, earn an associate degree and then transfer to DSU for a full degree. These two steps actually lower costs for military students while providing them steps up in their military pay.

“How are we keeping tabs on North Korea? Spatial technology. We have analysts in our program where that is literally their job. Because half of our enrollment is active military, we are helping them earn a college degree. They are online students, and we also offer the program in residence.”

But with the state not having much education money to spread around, Brooks has to make do with just under $78,000 a year to run his department. So he pursues grants and private-sector partnerships to help fund internships. He has students helping local counties build 911 databases and creating topographic maps for the National Geospatial-Intelligence Agency and Hexagon. Those maps are literally being “put in war fighters’ hands, their lives literally depending on my students building and learning a skill set. We are quite literally on the front lines with what we’re doing.”

Brooks and his students travel to conferences and meetings around the US and the globe, learning more and more, and he brings that knowledge and those connections home to the Mississippi Delta to impart to students and faculty.

One of his students, Scout Mauch – yes, she’s named after the book character and goes by Scout – grew up on a farm in Arkansas and is looking to put an education in this field to use in a possible agriculture career. One use for farmers is understanding where to water and fertilize specific areas of their fields, helping lower the costs of farming.

“It shows the satellite imagery of your crops but it also goes into the military and government. Marines in the field using maps,” Mauch said. “It’s probably one of the fastest growing degrees. I’m still overwhelmed by all of the possibilities. I don’t even know what exactly I want to do with this yet or what job I want to get. I just have one foot in trying to figure it out.”

Since April, Scout has traveled to Tampa, Florida, and Vietnam.

“We worked with the United Nations, well Talbot and Chris did, I was just a student on the trip, to help the Vietnamese Disaster Management Agencies with their capacity to use geospatial technologies,” she said. “I’ve been very busy since I got into this at Delta State.”

The degree has only been in place for a year and a half at Delta State.

“We are after every student we can find,” Brooks said. “I have jobs for students who want to work their way through, we have scholarships, and we don’t have out-of-state tuition. I recruit all types – I love artistic students; they are good at cartography and visualization. People that like to write computer code, we have a spot for them; people that like to work outdoors, we have a place for you too. Here’s your GPS, go collect some data in the field, crawl around the woods and count bears – it’s a team of students with very diverse interests and diverse backgrounds. They end up going and doing everything from working with the intelligence community, MDWFP, the Army Corps of Engineers, local municipalities and consulting firms. They are in any and every discipline you can think of. I’m trying to place students with the Washington Post. Someone needs to create those cool maps and graphics that go in their publications, and we teach that.”

To learn more about the program, contact Brooks at the Center for Interdisciplinary Geospatial Information Technologies:

Delta State University

110 Kethley Hall

Cleveland, Mississippi 38733

Tel.: 662-846-4520

E-mail: tbrooks@deltastate.edu

I’d always ‘preciate your comments here or over at Facebook, or you can tweet me @markhstowers … See yah next week! As a freelance writer, I’m working to grow my business and have created a GoFundMe page to help with that. Please take a look and see if you can help. I’d greatly appreciate it!

A Rebel, a Statesman — or Fightin’ Okra — and even a Trojan, I’m the Sunflower County farm boy with no green thumb who longed to live in the big city, got his wish and now is working his way back to the farm.

A freelance writer, middle-of-the-road-conservative and wannabe fry cook, I look to bring native Mississippi folks and businesses to your attention through my looking glass.

There are those of us that packed up Mississippi and took it with us to new destinations and neighbors. My area code may be 248 but my heart is all about 662, with plenty of room for the 601. Heck, I’ll even saunter into the 228 from time to time.

There’s more about me at markhstowers.com.

Top Ten Tech Trends 2018: There’s Value in AI, but Where Is the Value Greatest? – Healthcare Informatics

The spotlight has been shining bright on IBM Watson of late as healthcare stakeholders ponder how artificial intelligence can help solve some of the industry’s biggest problems

Editor’s Note: Throughout the next week, in our annual Top Ten Tech Trends package, we will share with you, our readers, stories on how we gauge the U.S. healthcare system’s forward evolution into the future.

Reading about the future of healthcare these days likely means there will be some reference to artificial intelligence (AI). It’s one of those “buzz terms” that is being used in a variety of ways across the sector, though applications are still quite early in most cases. But make no mistake—for healthcare stakeholders of all types, AI is a term that’s on their minds.

A big reason why AI in healthcare has become such a popular concept certainly is due to the mainstream media coverage of IBM Watson, an artificial intelligence supercomputer that was thrust into the world of healthcare just a few years after it won Jeopardy! against record-setting champions in 2011. Watson Health, a unit of IBM, was launched at the 2015 HIMSS conference and employs thousands of people. However, along with the popularity of Watson has come intense scrutiny, especially in the last year.

A STAT News report from September 2017 was one of the first major stories detailing how Watson has been performing in hospitals, specifically examining Watson for Oncology—a solution that aims to help physicians quickly identify key information in a patient’s medical record, surface relevant articles and explore treatment options to reduce unwanted variation of care and give time back to their patients.


But the piece found that Watson for Oncology has struggled in several key areas, noting that while IBM sales executives say that Watson for Oncology possesses the ability to identify new approaches to cancer care, in reality, “the system doesn’t create new knowledge and is artificially intelligent only in the most rudimentary sense of the term.” A more recent report, also from STAT, included internal documents from IBM Watson Health which indicated that the Watson for Oncology product often returns “multiple examples of unsafe and incorrect treatment recommendations.”

There was also one newsworthy story last year about a partnership between IBM and MD Anderson Cancer Center, part of the University of Texas, which soured to the point where the $62 million project for the cancer center to deploy Watson had been scratched. Lynda Chin, M.D., who oversaw the Watson project at MD Anderson before it fell apart, told STAT reporters that it was quite challenging to make the technology functional in healthcare. “Teaching a machine to read a record is a lot harder than anyone thought,” she told STAT, noting how her team spent countless hours trying to get the machine to deal with the idiosyncrasies of medical records.

Meanwhile, in a recent interview with Healthcare Informatics, Francine Sandrow, M.D., chief health information officer (CHIO) at the Corporal Michael J. Crescenz Veterans Affairs Medical Center in Philadelphia, notes that her team was working on a project with Watson, which was being used to identify patients who were at risk for post-traumatic stress disorder (PTSD) but had not actually been diagnosed with it. The project focused on simply feeding their charts into the Watson engine, says Sandrow, who is involved in several Veterans Health Administration clinical informatics initiatives.

Unfortunately, she says, “They de-funded [the project] before we got to the results part.” She explains, “When you’re dissecting a chart, the first thing you have to do, when you’re training a computer to recognize [something], is define the terms that would be included as triggers for a particular condition.” So, for post-traumatic stress disorder, she continues, the high volume of terms meant that there weren’t many charts that would be eliminated. In other words, there were too many indicators for the Watson machine to effectively pull out those patients at risk. “I’m not certain that they would be able to get the specificity that they were looking for. There’s a lot of subtle indicators for PTSD, and human behavior, that I think it would have clouded up the ability of the computer to recognize it, simply from the chart,” Sandrow says.

IBM, according to STAT, has reiterated to its customers that all data included in Watson for Oncology is based on real patients and that the product has won praise around the world for its recommendations. Discussions have also emerged on just how much the company should be blamed—versus the end user—for implementation struggles. To this point, Leonard D’Avolio, Ph.D., an assistant professor at Harvard Medical School and CEO and co-founder of healthcare technology company Cyft, notes, “Who is at fault there? IBM or the provider team that bought the product for marketing and hoped it would fulfill a vision?”

Of course, Watson is just one example of an AI technology that has sparked debate, but due to IBM’s immense industry standing and given how the tech giant has marketed Watson, for one of its top tech trends this year, Healthcare Informatics set out to ask industry leaders what they were seeing and hearing about the AI supercomputer, and how its performance has affected the broader artificial intelligence landscape.

Humans versus Computers

Bill Kassler, M.D., is the deputy chief health officer at IBM Watson Health, and as a physician, he offers a unique dual perspective on AI, coming from both ends of the spectrum: healthcare practitioner and technology company executive. When asked about the skepticism that has surrounded Watson of late, Dr. Kassler says that in general in healthcare, doctors, hospital administrators and other decision makers are conservative and operate in resource-constrained ways. “They are skeptical about technology, drugs, and anything else that’s new. That’s the baseline culture.”

Bill Kassler, M.D.

Kassler contends that even though IBM must work around this challenge, its AI offerings remain quite popular worldwide. Indeed, IBM Watson Health’s Oncology and Genomics business has doubled in revenue year over year since 2015, and its AI offerings are now being used in more than 230 hospitals around the world. Last year at this time, that number was just 55 hospitals, he says.

For traditional physicians, one of the primary critiques of AI is that the computer’s treatment recommendations may differ from the doctor’s. For instance, a physician who makes decisions based on decades of experience might not take too kindly to a computer recommendation that the doctor firmly believes is not the best option for the patient.

Kassler says he gets asked this question frequently, and attests that studies have been done on how often the Watson computer agrees with a panel of patient care experts. He references one particular study, published last year in the journal The Oncologist, that was led by oncologists at the University of North Carolina’s Lineberger Comprehensive Cancer Center. The oncologists tested Watson for Genomics on more than 1,000 retrospective patient cases. More than 99 percent of the time, Watson agreed with the physicians, but beyond that, in more than 300 cases, Watson found clinically actionable therapeutic options that the physicians had not identified.

To this point, Kassler acknowledges that if the technology simply always agrees with the human, there is “limited utility.” While it can improve unwanted variation and quality, “what you really want is for that system to surface new insights,” he says. In a separate study of Watson for Oncology that Kassler mentions, inclusive of nearly 2,000 high-risk breast cancer patients, 30 percent of the time, Watson identified a new tumor mutation and had actionable recommendations.

As such, Kassler says, “If there’s a conflict [between computer and human], our hope is that Watson will deliver a list of recommended treatment options, the doctor will look at that and [compare] what his or her patient has with the other factors that Watson has included, and will then choose to accept the computer’s recommendations or not. And then the doctor will tell Watson why he or she made that decision so that Watson can learn from it,” he explains.

Expanding on this point, Yan Li, Ph.D., an assistant professor of information systems and technology at California-based Claremont Graduate University, notes that most AI technologies are in the form of a black box—that is, providing an output (recommendations) from a set of inputs without an explanation as to why. “It is very difficult for an experienced clinician to trust such an output without a logical explanation, especially if the output is different from his or her experience-based judgment,” Li asserts.

Is it Worth the Battle?

More broadly speaking, the reason why so many innovators are bullish on leveraging AI in healthcare has to do with the computer’s learning or computation capabilities—specifically its speed and volumes in consuming information, Li says. “To provide high-quality care, medical practitioners must continuously update their clinical knowledge and keep current with the research literature,” she says, referencing a study that estimated it would require a physician approximately 627.5 hours per month to evaluate newly published research in primary care. But for computers, Li says, “processing this literature would take a matter of a few hours, and even less if we horizontally scale up the computation power.”

At the same time, there is a fair share of challenges beyond the aforementioned trust issue. Li believes that in their current state, most AI solutions require training. “It is not the computer; rather, it is the computational algorithm that is trained based on historical data, and then makes predictions, classifications, or inferences based on input data. AI algorithms fall short in not considering relevant clinical information that may not be captured in the training data,” she says, offering the example of a diagnostic conversation between the patient and the clinician.
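
As a minimal, hypothetical illustration of the training loop Li describes, the sketch below fits a classifier on made-up historical patient records and then classifies a new one; the point is that the model can only reflect patterns captured in its training data:

```python
# Hypothetical sketch of "trained on historical data": fit a classifier
# on made-up, labeled patient records, then classify a new one.
from sklearn.linear_model import LogisticRegression

# Toy features per patient: [age, number of prior visits]
X_train = [[34, 1], [61, 7], [45, 2], [70, 9], [29, 0], [55, 6]]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = condition documented in the chart

model = LogisticRegression().fit(X_train, y_train)

# The model reflects only the patterns present in its training data;
# anything discussed in the exam room but never charted is invisible.
print(model.predict([[58, 5]]))  # prediction for a new, unseen record
```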

There is additionally a fear conundrum: the concern that AI technologies will eventually diminish the need for certain human jobs as they have begun to do in many other sectors. But the experts interviewed for this piece believe that this apprehension is mostly unwarranted. “It’s not a valid fear. It’s just something that sells stories because talking about replacing humans is something that’s super interesting,” says Cyft’s D’Avolio. Sanket Shah, an instructor for the University of Illinois at Chicago’s Department of Biomedical and Health Information Sciences, agrees with D’Avolio, noting, “Physicians need not fear being replaced by AI. Physicians are the providers of care and AI is one of the many tools they use to administer that care and improve their craft.”

Leonard D’Avolio, Ph.D.

In the end, when all the concerns and potential benefits are weighed together, most experts are still bullish on how AI can provide key clinical decision support to improve patient outcomes and lower costs. D’Avolio believes that many health system leaders have recently broadened “what was once a narrow view of AI and machine learning within their organizations.”

What’s sorely needed, most leaders in this space agree, is better education on exactly how AI offerings will work in healthcare organizations. And in this sense, Watson’s successes and failures can be used to draw lessons moving forward. In the first STAT report, the authors wrote, “The actual capabilities of Watson for Oncology are not well-understood by the public, and even by some of the hospitals that use it.”

Of course, at what level a provider might leverage AI might also depend on several other factors. IBM’s Kassler notes, “If you are a small, one-person family practice in rural Vermont that is now just starting to use Excel spreadsheets for population health registries, yes, it’s too early [to start using AI]. But if you are a large integrated delivery network looking to invest in and be part of the development and perfection of this technology, it’s a great time,” he says.

As such, it makes it tough to answer if AI is at a crossroads in this current moment, and this will probably be a meaningful health IT trend in the years to come. As Kassler acknowledges, “For those on the leading edge, it’s a great time to get involved, but it’s not for everyone.”

Industry Groups Urge CMS to Reform Stark Laws, HHS Considers Reforming Anti-Kickback Statute – Healthcare Informatics

Nearly every day, it seems, new business combinations are announced that threaten to alter the landscape of U.S. healthcare forever: CVS’s acquisition of Aetna, completed last November; the announcement a year ago now that the executives of Amazon, Berkshire Hathaway, and JPMorgan Chase & Co. were launching a broad (if not well defined) initiative to improve consumer satisfaction and reduce costs for their employees; Cigna’s acquisition just last month of pharmacy benefit management (PBM) company Express Scripts; and Amazon’s acquisition last summer of online pharmacy company PillPack.

Every one of those business deals represents a disruptive move in U.S. healthcare, with unalike “species” of organizations combining with one another. And now, the retail drugstore giant Walgreens Boots Alliance Inc. and Microsoft Corp. are coming together in yet another disruptive venture. As Managing Editor Rajiv Leventhal wrote in an article on Tuesday, the corporations “are joining forces on a major seven-year healthcare partnership that will aim to ‘deliver innovative platforms that enable next-generation health networks, integrated digital-physical experiences and care management solutions.’” As he wrote, “The companies announced today that they will combine the power of Microsoft Azure, Microsoft’s cloud and AI (artificial intelligence) platform, healthcare investments, and new retail solutions with WBA’s customer reach, volume of locations, and outpatient healthcare services to accomplish their goals: to make healthcare delivery more personal, affordable and accessible.”

As Leventhal noted in his report, “While innovation in healthcare has occurred in pockets, officials of the two companies believe that ‘there is both a need and an opportunity to fully integrate the system, ultimately making healthcare more convenient to people through data-driven insights.’” Further, he noted, “As part of the strategic partnership, the companies have committed to a multiyear research and development (R&D) investment to build healthcare solutions, improve health outcomes and lower the cost of care. This investment will include funding, subject-matter experts, technology and tools, officials noted in the announcement. The companies will also explore the potential to establish joint innovation centers in key markets. Additionally, this year, WBA will pilot up to 12 store-in-store ‘digital health corners’ aimed at the merchandising and sale of select healthcare-related hardware and devices.”

“This gap creates an opportunity for the pharmacist to help monitor the patients’ health and prompt the patient to receive preventative care in the retail clinic or through a virtual care visit. Using an enterprise health cloud, like Azure, you create a more connected ecosystem so that we can share that data with the patient’s additional providers, track outcomes, and intervene earlier when an issue arises,” Microsoft CEO Satya Nadella said in a statement Tuesday.

And, Leventhal wrote, “Notably, the companies will also work on building an ecosystem of participating organizations to better connect consumers and providers—including Walgreens and Boots pharmacists and acute care providers—so that participation by major healthcare delivery networks will give people the opportunity to seamlessly engage with WBA healthcare solutions, all within a single platform.”

Speaking to the difference between retail pharmacies and traditional care providers, Forrester analyst Arielle Trzcinski said in a statement emailed to the press that “[R]etail pharmacies offer an opportunity to engage with the patient much more frequently than at an office visit.” Chronic care patients, for example, see their pharmacist often, while some figures indicate that the average diabetic patient sees his or her provider once every six months.

The implications of all of this are, of course, huge. For one thing, if one were to ask the average patient/healthcare consumer with whom they interacted more, doubtless, the vast majority would cite their retail pharmacists, rather than their primary care physicians. What’s more, what happens if Walgreens is able to follow through, as CVS also intends to do, in creating minute clinics in retail pharmacy locations? The impact could be revolutionary.

Indeed, it’s no secret that many patients are dissatisfied with the cumbersome, challenging processes around accessing primary and specialty care in the U.S. healthcare system. Simply accessing a timely appointment often proves to be a major hassle; and encounters around needed follow-ups and around questions to doctors and nurses often turn out to be such a hassle that many patients simply give up, with the result of medication non-compliance and other issues.

So what will happen if Walgreens, like CVS, manages to achieve success with one or more elements of this initiative? Those could include enhanced continuum of care for patients, especially those with chronic diseases; improved communication among all care delivery stakeholders; and enhanced patient/consumer satisfaction.

A few stakeholder groups should be paying particular attention here, including practicing physicians and healthcare IT leaders. For practicing physicians, could anyone deny that this business initiative, along with the others mentioned above, should be disconcerting at the very least? Already, patients needing relatively immediate medical attention are turning en masse to urgent care centers, as both health systems and health insurers work to cut down on the volume of emergency department visits, which are tremendously expensive and which burden the healthcare delivery system in ways that are not sustainable. But now, with both Walgreens/Microsoft and CVS/Aetna, is anyone denying that the era of pretty-close-to-immediate medical attention is on the horizon?

The reality is that, while most patients like their primary care physicians and are satisfied with their care overall, strong majorities, in polls, continue to complain about poor service, bad communication, and delays accessing care and follow-up support. What happens when most decent-sized Walgreens and CVS drugstores are staffed up with PCPs or advanced practice nurses to treat the colds, coughs, flus, strep throats, and minor skin and digestive issues that could easily be handled by such service offerings?

One of the core policy issues here is that the U.S. healthcare payment system remains largely predicated on primary care physicians physically touching patients in order to get paid. Yes, telehealth services are expanding daily; but in most situations, patients still need to go through the awkward, inconvenient, sometimes even arduous process of scheduling an appointment, using some form of transportation to get to that appointment, and waiting in a crowded physician office, in order to access primary care. But in 2019, when GrubHub can deliver one’s banh mi Vietnamese sandwich to one’s home, and Amazon is sending everything from books to clothing to furniture to God-knows-what directly to people’s doors, how much longer will healthcare consumers continue to be patient with the glacial pace of care delivery change in U.S. healthcare?

Meanwhile, healthcare IT leaders will inevitably find themselves somewhat behind a proverbial eight-ball on all this, caught between the intensifying demands on the part of practicing physicians, especially primary care physicians, for full clinical IT support for their practices, and constant business changes, including merger-and-acquisition activity in their own organizations that is continuously scrambling their long-term planning.

So we’re seeing both business and technology changing, and changing quickly, with numerous examples already of industry-disruptive business combinations, and technology advancing to the point where previously unimagined breakthroughs are now imaginable. For example, Walgreens and Microsoft noted that “Through this agreement, Microsoft becomes WBA’s strategic cloud provider, and WBA plans to migrate the majority of the company’s IT infrastructure onto Microsoft Azure,” as corporate officials put it. And “Microsoft also plans to roll out Microsoft 365 to more than 380,000 Walgreens employees and stores globally.” And, to make things just that much more intriguing, the announcement quoted Stefano Pessina, executive vice chairman and CEO of the Walgreens corporation, as stating that “WBA will work with Microsoft to harness the information that exists between payors and healthcare providers to leverage, in the interest of patients and with their consent, our extraordinary network of accessible and convenient locations to deliver new innovations, greater value and better health outcomes in health care systems across the world.”

As renowned Chicago architect Daniel Burnham so famously said, “Make no little plans; they have no magic to stir men’s blood and probably themselves will not be realized.” There’s no question that the plans of the senior leaders behind all of these business alliances, combinations, and initiatives are going to be “no little plans.” It would behoove clinicians, clinician leaders, healthcare IT leaders, and all c-suite leaders in provider organizations to think Burnham-sized thoughts; these businesspeople from outside traditional healthcare delivery are certainly doing so.