2018 Innovation Issue: There’s promise in cryptocurrencies but plenty of skepticism, too – Indianapolis Business Journal

Cryptocurrency might not be coming to a bank near you—yet. But bitcoin ATMs are springing up across central Indiana and the nation, and some tech leaders say that, within a decade, cryptocurrency could be more life-altering than the internet.

“Cryptocurrency and blockchain technology—the engine that runs cryptocurrency—is coming so fast and will disrupt so many businesses and industries, a lot of people have no idea,” said Patrick Sells, founder and president of Sells Group, an Indianapolis digital marketing firm that took a hard turn into cryptocurrency last year.

“It’s coming faster than artificial intelligence, virtual reality or autonomous vehicles.”

Julien Nadaud, chief product officer at Carmel-based software company Determine Inc., called cryptocurrency “a revolution.”

“It’s going to change the way people and businesses interact,” he said.

While bitcoin ATMs—which are essentially exchange stations—are just the tip of the cryptocurrency iceberg, they are a sign of how ubiquitous this emerging technology could become.

Central Indiana has at least three bitcoin ATM operators and about two dozen machines—most popping up in the last two years, in places as conspicuous as the Circle Centre mall food court.

“We started with four [bitcoin ATM] machines in Indianapolis and we’ve quickly grown to 10,” said Michael Dalesandro, founder and CEO of Chicago-based RockItCoin. “We continue to see growing adoption in your market and others.”

So what exactly is cryptocurrency and what is its appeal?

The first is relatively easy to answer. The second depends on whom you ask, but cryptocurrency’s biggest draws are its decentralized, democratic organization and its seemingly paradoxical pairing of fully transparent transactions conducted in complete anonymity.

Let’s start with defining cryptocurrency.

It’s a self-regulated, decentralized digital asset designed as a medium for exchanging goods and services. Put another way, a cryptocurrency such as bitcoin is, at bottom, an accounting system.

Bitcoin uses cryptography, techniques for secure communication, to record transactions and control the creation of the virtual currency. The photos you see of actual bitcoins are nothing more than representations. No physical bitcoin or any other cryptocurrency actually exists.
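To make that concrete, here is a toy sketch in Python of how hashing chains records together so that past entries cannot be quietly rewritten. It illustrates the principle only, not Bitcoin’s actual data structures; the field names are invented for the example.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents; any change to the data changes the hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Each block stores the previous block's hash, so altering an old record
# breaks every hash that follows it and the tampering is instantly visible.
chain = []
prev = "0" * 64  # placeholder hash for the first ("genesis") block
for txs in [["alice pays bob 1.5"], ["bob pays carol 0.7"]]:
    block = {"transactions": txs, "prev_hash": prev}
    chain.append(block)
    prev = block_hash(block)
```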

So how is it different from standard currency?

Electronic payment systems have been part of American life since 1871, when Western Union introduced money transfers by telegraph; the company followed in 1914 with the first charge card.

Modern-day cryptocurrencies differ from those in one essential way: They don’t represent a claim on value; they are the value.

Cryptocurrencies are not controlled by a central authority such as a financial institution or government. Instead, the ledger tracking the cryptocurrency—known as the blockchain—is maintained by myriad computers run by what are known as miners.

The miners, who can be anyone with access to a good computer and the appropriate software, solve math calculations—purposefully complex ones—that are used to record and verify transactions to maintain the blockchain. The miners whose computers first solve the calculations are rewarded with allotments of the cryptocurrency.
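In bitcoin’s case, the “purposefully complex” calculation is a brute-force search: miners vary a number called a nonce until the block’s hash falls below a target set by the network. A minimal sketch, at a difficulty far easier than the real network’s:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Try nonces until the hash starts with `difficulty` zeros (toy proof-of-work)."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce  # the first miner to find a valid nonce earns the reward
        nonce += 1

print(mine("example block data"))  # finishes in well under a second at this difficulty
```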

In almost every room of Sells Group’s Indianapolis office are computers the company built for the sole purpose of mining cryptocurrencies, such as bitcoin and ethereum. Each rig contains about $7,000 worth of hardware, Sells said.

While cryptocurrency work is still a small part of Sells Group’s business, it’s fast-growing, Sells said. In the last year, calculations handled by the firm’s “mining rigs” have earned it cryptocurrency worth six figures when translated into U.S. dollars, he said, making his company “one of the largest cryptocurrency mining operations in Indiana.”

Anonymity, yet transparency

In traditional payment networks like banks, credit cards, PayPal or Venmo, an account is linked directly to the user’s name and often to a government-issued identifier, such as a Social Security number. The bitcoin blockchain handles accounts, transactions and privacy much differently.

Anyone can create and store a bitcoin account, called a digital wallet, on a computer or smartphone—without providing a name, street address or email address. It’s nearly impossible to track who owns which account. In fact, some transactions can even be conducted without an internet connection.

But even though the people transacting business are anonymous, the transactions themselves are wide open. Anyone who installs the proper software can see the entire blockchain and can track and verify transactions. Every piece of cryptocurrency is accounted for, ensuring no one is giving or getting counterfeit bitcoin. The balances of all accounts are visible on the blockchain—but the owners’ identities are hidden behind cryptographic addresses.

By monitoring and updating the ledger in a collective, consensus-based system, the need for a broker or middleman acting as a repository of information is eliminated. That does away with fees, inefficiencies and the potential for corruption and risk that comes with centralizing information, cryptocurrency advocates say.

“The idea that there’s data somewhere in a network that no one person has authority over, nobody can hack—not even inside jobs—that didn’t exist until bitcoin’s blockchain,” said Erik Townsend, a hedge fund manager who is a noted bitcoin expert and MacroVoices.com podcast host. “I do think this is going to change the world in a big way.”

In the beginning

The genesis of cryptocurrency can be traced to the cypherpunk movement of the 1990s. Cypherpunks were a group born out of a love of the internet, cryptography, the possibility of privacy, and a new way of doing business.

But it took the Great Recession and banking crisis to lay fertile-enough ground for the birth of Bitcoin in 2009. Launched by an anonymous person or group going by the handle Satoshi Nakamoto, bitcoin was the first decentralized cryptocurrency. Since then, more than 1,800 others have been created, but only a handful are commonly used.

Bitcoin is the largest by market value—the number of “coins” on the market multiplied by their dollar value—at $128.5 billion, as of May 30. The next five are ethereum ($56.7 billion), ripple ($24.1 billion), bitcoin cash ($17.3 billion), EOS ($10.9 billion) and litecoin ($6.8 billion).
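As a rough check on that market-value arithmetic (the circulating-supply figure below is an assumption for illustration, not from the article):

```python
# Market value = coins in circulation x price per coin.
# Assuming roughly 17.1 million bitcoins in circulation at about $7,500 each:
print(17_100_000 * 7_500)  # ~$128 billion, in line with the figure above
```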

Cryptocurrencies come into existence in a process similar to taking a company public. Through an initial coin offering, or ICO, value is established and the initial coins are issued. There’s one big difference between an IPO and an ICO: an ICO is unregulated because—for now—federal authorities don’t consider most cryptocurrencies securities.

After the initial offerings, cryptocurrencies continue to be issued to the miners maintaining the blockchain until a maximum amount—pre-set by the cryptocurrency’s creator—is reached. In bitcoin’s case, Satoshi Nakamoto set the limit at 21 million pieces to be issued by 2140.
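The 21 million figure follows from bitcoin’s published issuance schedule, a detail the article does not spell out: the block reward started at 50 coins and halves every 210,000 blocks, and the resulting series sums to just under 21 million. A quick sketch:

```python
# Bitcoin's block reward halves every 210,000 blocks, starting at 50 coins.
total, reward = 0.0, 50.0
while reward >= 1e-8:  # rewards below one hundred-millionth of a coin round away
    total += 210_000 * reward
    reward /= 2
print(total)  # ~20,999,999.998: just under the 21 million cap
```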

The basic rules of supply and demand set the value of cryptocurrencies. And those values fluctuate wildly.

For instance, a single bitcoin was valued at more than $19,300 in mid-December; by April 1, the value had dropped to less than $7,000. Still, that’s a lot higher than the $1,968 it was trading at a year ago. In 2012, you could snag a bitcoin for less than $100.

Acceptance as currency

While cryptocurrency experts said few Indiana companies deal in or work with cryptocurrencies, a growing number of companies nationwide do accept them.

Overstock.com, Lamborghini, Tesla and even Victoria’s Secret—to some degree—deal in cryptocurrencies.

Sells said his company is developing a business-to-consumer rewards program using cryptocurrency for one of its clients, Indianapolis-based Jordan Standard Distribution, and will announce a similar project for another client this summer.

Many U.S. banks initially dismissed cryptocurrencies, but far fewer today doubt their potential impact.

“I think the concept is valid. We are looking into that space. You have many central banks looking into it,” JPMorgan Co-President Daniel Pinto told CNBC in May. “The tokenization of the economy, for me, is real.”

Some see it as a potential threat to the banking industry.

“Clients may choose to conduct business with other market participants who engage in business or offer products in areas we deem speculative or risky, such as cryptocurrencies,” Bank of America said in a 10-K filing earlier this year. Such increased competition may “negatively affect our earnings” or affect “the willingness of our clients to do business with us.”

Craig Fortner, first vice president of information technology for the Fishers-based First Internet Bank, said First Internet Bank executives “see promise” in the underlying blockchain technology—which can be used in many ways—but remain “neutral” on cryptocurrency.

First Internet Bank is “unlikely to dive in” to cryptocurrencies because they are not regulated, Fortner added.

Regarding bitcoin, “you cannot have something where the business proposition is to be anonymous and to be the currency for unknown activities,” Pinto said. “That will have a very short life, because people will stop believing in it, or the regulators will kill it. Cryptocurrencies are real but not in the current form.”

Still, he added: “I have no doubt that in one way or another, the technology will play a role.”

Undeniable upside

Proponents say the upside of cryptocurrency and blockchain technology is overwhelming.

Fortner said cryptocurrency could offer a uniform way to transfer a currency globally in real time—something that is currently impossible.

The decentralized, distributed ledger behind cryptocurrencies is nearly impossible to hack—at least with today’s technology. Not even an administrator could defraud the blockchain, as can—and does—happen with a centralized system.

And since the blockchain is housed on multiple computers, it can’t simply crash or be taken down.

Cryptocurrency proponents love that it removes a middleman, and the accompanying fees.

“Before, if you wanted to send something of value across the internet, you had to get someone else involved. You had to have a credit card company or PayPal or maybe a bank involved in the transaction,” said Gavin Andresen, founder of the Washington, D.C.-based Bitcoin Foundation and a forefather of the cryptocurrency movement.

“The promise of bitcoin is that you’re directly sending this currency to another person, and then the bitcoin network performs the function that PayPal, a bank or a credit card company normally would,” Andresen said.

But because maintaining bitcoin’s blockchain is lucrative, some early adopters predict banks and governments will eventually try to take over.

Global IT firm Infosys, which last year opened a hub in Indianapolis and plans to expand here, recently announced it is setting up a blockchain for seven banks in its home nation of India. Some say that’s a foreshadowing of the banking industry’s trying to muscle in and take over the citizen-run cryptocurrency systems.

Investment or currency?

Because of their relatively high value and volatility, cryptocurrencies like bitcoin are “more like a bar of gold than a dollar bill,” Fortner said.

Nakamoto clearly designed bitcoin as a means to trade goods and services, not merely as an investment. Most cryptocurrencies, including bitcoin, are completely transferable, and each bitcoin can be divided into 100 million units, so there’s room for it to expand to small-value transactions.
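Those 100 million sub-units are commonly called satoshis (a detail the article leaves out), and they are what make small-value transactions possible even at high coin prices. An illustrative calculation:

```python
SATOSHIS_PER_BTC = 100_000_000
price_usd = 7_000                    # roughly bitcoin's price in April 2018
print(price_usd / SATOSHIS_PER_BTC)  # 7e-05: one satoshi is $0.00007
```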

While few U.S. citizens find immediate need for cryptocurrencies, a global cryptocurrency has real allure for people and businesses in parts of Asia, Africa and Central and South America where access to banking is limited and the government-issued currency is unstable.

About 2.5 billion adults worldwide don’t have bank accounts, and cryptocurrency advocates say the new technology gives people opportunities to have a bank within their cell phone.

As more parts of the world adopt cryptocurrencies, global companies in the United States will have to consider using them, said John Sarson, principal with Blockchain Momentum LP, an Indianapolis-based hedge fund that invests in blockchain technologies and cryptocurrencies.

“Places like Miami and Los Angeles are already cryptocurrency hotbeds,” Sarson said. “In Miami’s case, it’s because Latin America has picked up on it due to the instability of its currency.

“Right now, Indiana is standing flat-footed when it comes to cryptocurrencies,” he said. “There is no understanding of the impact this is going to have by business executives here.”

Sarson said if Indianapolis is going to continue to build its reputation as a tech hub, it “needs to get on board.”

But before bitcoin and other cryptocurrencies explode globally, one major limitation in the technology needs to be conquered.

“Scalability is an issue,” said Karthik Kannan, management professor at Purdue University’s Krannert School of Management. “Because of the complexity of the math required to maintain the blockchain, you can only do so many transactions, and that is a limiting factor to this technology.”

The bitcoin blockchain can handle about seven transactions per second. That pales in comparison to the 750 wire transactions per second Western Union can spit out using traditional currencies or the 25,000-plus transactions per second Visa can handle.
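A back-of-the-envelope comparison using the rates quoted above shows what that gap means over a full day of processing:

```python
# Transactions each network could clear in 24 hours at its quoted rate:
for name, tps in [("bitcoin", 7), ("Western Union", 750), ("Visa", 25_000)]:
    print(name, tps * 86_400)  # bitcoin: ~605,000/day; Visa: ~2.16 billion/day
```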

“Almost any consumer knows the frustration of waiting even 10 to 15 seconds for their Visa transaction to go through,” Kannan said. “If bitcoin or any other cryptocurrency were ever used by the masses, it would be many, many times worse.”

In May 2017, as activity on the bitcoin blockchain heated up, it took up to four days to complete a transaction.

Security worries

Speed isn’t the only concern. Some might see banks, credit card companies and payment systems as unneeded middlemen, but they offer an undeniable layer of security not present in unregulated cryptocurrency.

While the underlying blockchain itself is secure, the digital wallets that store currency can be compromised. If a laptop or cell phone housing a digital wallet is ruined or stolen, the bitcoin is gone. And tech experts say you should never keep a fully stocked digital wallet on hardware that is connected to the internet; it can too easily be hacked and your cyber loot heisted.

Best practice, security experts said, is to keep cryptocurrency on a thumb drive or a hard drive not connected to the internet—ideally stored in a fireproof safe or safety deposit box—and to download only the amount you need immediately to a wallet connected to the internet. When a person loses track of his or her digital wallet, there’s no way—because there’s no central monitor—to claw the cybercash back.
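At bottom, the cold-storage advice amounts to generating and keeping the secret key on hardware that never touches the internet. A minimal sketch of the idea using Python’s standard library; this is a toy illustration, not a real bitcoin wallet, which derives spendable addresses from the key through several further steps:

```python
import secrets

# Run this on an OFFLINE machine. Whoever holds this key controls the funds,
# so the hex string belongs on paper or an air-gapped drive, never in email.
private_key = secrets.token_hex(32)  # 256 bits of cryptographically secure randomness
print(private_key)
```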

James Howells of Great Britain says he lost 7,500 bitcoins when he accidentally discarded a computer hard drive in 2013. As the value of those bitcoins grew to $75 million, he petitioned government officials in his hometown of Newport, South Wales, to let him dig up—at his own expense—his neighborhood landfill. That request was denied. Even at today’s lower prices, Howells’ lost bitcoin is valued at $60 million.

Transactions with the ATMs that are sprouting up must also be treated with care. Cryptocurrency transferred into a digital wallet via an ATM should be quickly moved into a more secure off-line digital wallet. Some users even opt to carry paper printed with the QR codes associated with their bitcoin information—preferring not to scan a QR code from their digital wallet at the ATM, or wary of keeping a large amount of cryptocurrency on their cell phone.

Users can then scan the QR code on paper to transfer that loot to an off-line storage device.

But that can be dangerous. RockItCoin’s website warns: “Be careful with the paper wallet since it holds all the funds sent to that … address. Transactions are irreversible so if a paper wallet is lost, damaged or destroyed the funds are LOST FOREVER!!!”

Tales from the dark side are not limited to individual users.

In 2014, a well-known Tokyo-based bitcoin exchange and depository, Mt. Gox, suffered an irreversible collapse.

Mt. Gox announced that approximately 850,000 bitcoins belonging to customers and the company were missing and likely stolen, an amount valued at nearly $500 million at the time.

“The lesson of Mt. Gox is, you don’t leave your bitcoin with an untrustworthy third party,” Kannan said.

Regulation gyration

Government officials at every level are struggling with how to regulate cryptocurrencies. New York and several other states require a license to deal in cryptocurrencies. But that has chased away a number of tech companies, a scenario many regions would like to avoid.

There has also been a crackdown on initial coin offerings when the Securities and Exchange Commission deems that organizers are packaging cryptocurrencies as securities or making promises to potential investors or users that they can’t keep.

Indiana Secretary of State Connie Lawson in May announced that her office has taken an enforcement action as part of an international crackdown on fraudulent ICOs and cryptocurrency-related investment products. The action, called Operation Cryptosweep, is being coordinated by the North American Securities Administrators Association.

The creators of bitcoin and other cryptocurrencies are worried government officials will eventually outlaw non-regulated cryptocurrencies and come up with a government-issued option.

The privacy and anonymity afforded by cryptocurrencies are a magnet for unsavory people and businesses, such as drug buyers and sellers and unlicensed arms dealers. Anonymity also makes cryptocurrencies a means of trying to avoid paying taxes. All of that has raised the antennae of government and law enforcement officials.

Many cryptocurrency advocates say U.S. and other government officials will use the threat of terrorists using cryptocurrencies as a reason to shut them down. But those same advocates say the governments have other reasons to want control.

“There’s so much government can do with digital currency that they can’t do with traditional currency,” said Townsend, the MacroVoices.com podcast host. “With government-issued digital currency, every single transaction is traceable and controllable by the government.

“Every single payment has to have the tax ID of the payor and the payee,” he said. “Every single transaction can be voided or clawed back by the government. It’s the opposite of bitcoin. I’m predicting a Libertarian holocaust. It’s a horrible thing, and I hope I’m wrong.”•

The IT and Data Analytics Drive – Healthcare Informatics

As providers plunge further into risk-based contracting, data analytics and strong IT foundations are seen as critical success factors

As challenging as it is for the leaders of hospitals, medical groups and health systems to strategize broadly around the plunge into risk-based contracting, strategizing around the information technology foundations and data analytics to support that journey is turning out to be equally challenging. That is the verdict of leaders from across the spectrum of U.S. healthcare, and from the hospital, physician group, integrated health system, and health plan sides of the table.

Fundamentally, says Shawn Griffin, M.D., vice president, clinical performance improvement and applied analytics, at the Charlotte-based Premier Inc., “The response to risk is trying to increase control, and our data systems have not been organized to give us total control over processes. The challenge,” Griffin says, “is that the insurance company you’re contracting with controls and owns the data; data ownership is an important concept. And even doctors who are in the same network aren’t all on the same EHR [electronic health record]. And now that the system of care isn’t just hospitals, but outpatient and post-acute as well, you have to build data and IT governance” around participation in value-based healthcare contracting. “You have different metrics for different claims and clinical data types; and the fact that we’re trying to bring in different types of data and tell the same story with them is difficult. And the lack of interoperability is a huge challenge,” he says.

“I feel like we’re making progress on the claims data, because organizations are getting better at doing claims analytics; we’re now beginning to be able to use claims data to identify, for example, who the high-risk and high-cost patients are,” says Joe Damore, Griffin’s colleague and a vice president at Premier Inc. “But I still see a huge challenge in the lack of interoperability among EHRs. I don’t see anyone who’s mastered the situation yet of networks that are using multiple EHRs.”

“We have a sketch of interoperability that often involves dumbing down the information you share among EHRs. Almost nobody talks about claims interoperability,” Griffin adds. “Medicare Advantage versus commercial plans, multiple Medicare Advantage plans, all are different versions of claims data. We all have a phrasebook for a foreign language,” he adds.

Health plan leaders agree that there are some very fundamental challenges involved, including on the payer side. Speaking of the challenges for providers partnering with health plans, in marrying clinical and claims data, as well as simply in getting data to providers in a timely way, Chris Jaeger, M.D., vice president of accountable care innovation and clinical transformation at the San Francisco-based Blue Shield of California, says, “Having been on both the provider and plan sides, those are definitely real hurdles. It speaks to immaturity in master data management. And even when there’s more mature enterprise master data management, it will vary across organizations, so that’s a huge problem. And with respect to data timeliness, one challenge relates to plans sharing adjudicated claims data, where inevitably there’s a lag. On the provider side,” he adds, “my last experience was with PPO shared-savings contracts, and we had problems with timeliness and accuracy of data from plans, and sometimes just in terms of the master data management.”

What’s more, Jaeger says, data integrity remains a core challenge, in all situations involving health plans sharing data with provider organizations. “We were seeing data integrity issues that we needed to fix, before we could marry the plan data with our clinical data. And a lot of vendors will say they have the capability to deal with that, but the devil is in the details. So, we’ve been partnering with some of our provider partners, sitting down with them, with their resources, as well as with the partnering population health vendors, to improve how the data is moved and used, so they can do a better job.” What’s more, he says, “Data management doesn’t sound like a sexy value proposition, but it ends up being of incredible value. So really, if an organization is able to cleanse and aggregate data from multiple sources and bring the data into its analytics, you get better results.”

Getting Physicians Engaged in the Broader Effort

When it comes to getting physicians in practice engaged and motivated to support ACO (accountable care organization) and other value-based healthcare initiatives, the challenges are manifold, says R. Todd Richwine, D.O., chief medical informatics officer at the Texas Health Physicians Group, the 757-physician umbrella physician group attached to the Texas Health Resources integrated health system, which is based in the Dallas suburb of Arlington. Asked what he’s learned in the past few years around this, Richwine says, “Overall, that this is all too complicated. And that’s a big part of why I got into this role. When I came into our EHR from an older one that frankly was simpler and easier to use, I was astounded by the complexity, as an end-user. I went to a one-day conference, and our chief system clinical information officer said that we need to make the right thing, the easy thing to do. Doctors want to do the right thing, but if it’s too complicated, they’ll not do that. So, my guiding principle as CMIO has been to make the right thing the easy thing to do. We don’t want to add to the confusion or workload of physicians.”

Still, Richwine says, “I’ve really enjoyed” working with fellow physicians on clinical IT development around clinical performance improvement efforts. “As I’ve been able to go out to physician clinics and talk about improving quality measures, with rare exceptions, our physicians are interested and motivated to improve their outcomes, especially around chronic conditions like diabetes and hypertension; they very actively look for those patients who aren’t following up or who are falling outside the parameters on a regular basis. Once I show them the tools and techniques, they get very interested and get their teams involved in improving outcomes for those patients.”

“The success of value-based care can only come if your physicians are truly involved,” says Sohail, the CIO at the Dallas-based Premier Management Company, a firm that organizes and manages ACOs (and is unrelated to the Charlotte-based Premier Inc.). “And unless and until you are able to provide them no more than three actions they’re supposed to take, around a particular patient, in order to be successful clinically or financially, they’ll never be able to execute,” says Sohail (who uses one name only). “So keeping value-based care as simple as possible, and as smart as possible, is key, knowing that physicians have very limited time and are running on a treadmill, and building systems around them that can optimize their work, and show them that if they adopt the system, they’ll be successful, and if not, they won’t,” he says.

Meanwhile, fundamental IT issues remain, says Premier Inc.’s Griffin. “You need interoperability to make this work, and interoperability requires connectivity, so you need to be fleshing out your connections with all your providers, and you need to get the wiring down. And you’ve got to be working with your physicians, and make sure your clinicians are at your table as you’re building out your plan, and solving problems, not just increasing responsibilities. There’s no magic bullet, but there are islands of competency, where leaders of patient care organizations are doing this well, and sharing information with others.”

What’s more, says Premier Inc.’s Damore, “I also think you need to take an interdisciplinary approach to governance of IT. The CMIO and CIO can lead that effort, but shouldn’t be doing it in isolation. You need clinicians, your quality leader, your financial leader, and your clinical integration leader. So you need multiple people at the table. And you need to develop a roadmap that’s logical.”

“I think the biggest piece that I’d point to is that, through all of the iterations around value-based healthcare, and various programs, etc., there really is a handful of core building blocks and a common set of needs around data aggregation, predictive analytics, and performance measurement,” says Laurie Sprung, Ph.D., vice president, consulting, at The Advisory Board Company, the consultative firm based in Washington, D.C. Given that, Sprung says, “Healthcare IT leaders need to familiarize themselves with those elements. And many technology people have a philosophy of how they want their technology infrastructure to look and how the pieces relate to each other. I get that, they want to rationalize all the elements. But if you start out focusing on where your organization needs to go with this, you’ll be better off, because the technology is not in its final form yet, so it’s not just what the technology does, but how it aligns with the building blocks of creating a value-based care delivery system,” she emphasizes.

And the IT leaders of organizations that are early on the journey into value-based healthcare are already learning important things, says Michael Restuccia, vice president and CIO at Penn Medicine, the multi-hospital system based in Philadelphia. “First and foremost,” he says, “we learned what the definition of a readmission was. And I think the institutional knowledge and agreement of what a readmission is and is not, was a big learning for us. And we learned that, at times, different parts of our organization had different definitions of it.” In other words, he says, “We’ve learned that we needed to standardize our definitions, in order to modify our behaviors. And the IT system is the glue that holds that together.”

“Organizations need to focus on enterprise data governance and data management, as a key capability that can be built up; and they need to take a leap of faith and begin trusting the health plans more, with respect to data sharing,” says Blue Shield of California’s Jaeger. “When I was on the provider side as a CMIO, and we started talking about sharing clinical data with plans, there was a lot of fear that the data would be used against us, in terms of competitive advantage we had in relation to competing provider organizations. So the key is to take that leap of faith and share data with health plans, understanding that the health plans have legal constraints with what they can do, too, with respect to HIPAA (Health Insurance Portability and Accountability Act of 1996). So focusing on value-based incentives and quality improvements, will help the people we’re both trying to help—the patients.”

I know what you’ll do next summer – The Economist

Algorithm blues

The promise and peril of big-data justice

Can algorithms accurately predict where crime will occur?

EIGHT storeys above downtown Los Angeles, Sean Malinowski, deputy chief of the Los Angeles Police Department (LAPD), focuses intently on a computer map of his old stomping ground. Nestled between Burbank and Santa Clarita, the Foothill district is a hotch-potch of industrial and residential districts riven by highways. Mr Malinowski ran its police station before his promotion moved him downtown.

Colourful dots representing reported crimes freckle the map like psychedelic pimples. Adjacent to some of the dots are red squares. Each one represents a 250,000-square-foot (2.3-hectare) area that PredPol, a crime-prediction software used by the LAPD and at least 50 other law-enforcement agencies around the world, has flagged as being at risk of future criminal activity. Mr Malinowski says that, if he were still in charge of policing in Foothill, he would ask his officers to drive through those areas frequently, “so we’re there randomly—it throws the criminals off.” The idea is not to nab people red-handed, but to deter them through increased police presence.

PredPol is just one of a number of firms offering crime-prediction software to police forces. While the precise components of each firm’s algorithms probably differ, the broad idea is the same. They aim to help police allocate resources efficiently by using large amounts of data to predict (and therefore prevent) crime.

The use of algorithms to tackle complex problems such as urban crime, or to try to forecast whether someone is likely to commit another crime, is not inherently alarming. An algorithm, after all, is just a set of rules designed to produce a result. Criminal justice algorithms organise and sort through reams of data faster and more efficiently than people can. But fears abound: that they remove decisions from humans and hand them to machines; that they function without transparency because their creators will not reveal their precise composition; that they punish people for potential, not actual, crimes; and that they entrench racial bias.

Defenders of such programmes argue, correctly, that police have always relied on prediction in some form. Officers line parade routes, for instance, because experience has shown that the combination of crowds, alcohol and high spirits creates an increased public-safety risk. Eliminating prediction from policing would produce an entirely reactive force. All these programmes do, defenders say, is harness more data from more sources to help police make better decisions.

But the algorithms on which police base their decisions are, as far as the public is concerned, black boxes. The companies that create and market them consider their precise composition trade secrets. “Algorithms only do what we tell them to do,” says Phillip Atiba Goff of John Jay College of Criminal Justice in Manhattan. If their creators feed them biased data they will produce results infected with bias. And predictive policing is just one way in which the criminal-justice system is using algorithms to help them make decisions.

New Jersey uses an algorithm based on past criminal history, age, past failure to appear at trial and the violence of the current offence to determine whether someone is suitable for bail—that is, whether he presents too great a risk of flight or of committing more crimes while awaiting trial. Several states use algorithms to provide sentencing recommendations. At least 13 American cities use them to identify people likely to become perpetrators or victims of gun violence.

NYPD, too

The first time such approaches came to public notice was in the 1990s, when William Bratton introduced CompStat, a statistically driven management system, into the New York Police Department (NYPD), which he ran. CompStat involved regular meetings of commanding officers discussing prevention strategies and recent crime data from their precincts. As one former NYPD deputy commissioner says, CompStat encouraged police to ask, “What is the problem? What is the plan? What are the results to date?” and to use data to answer all of those questions.

But CompStat was largely reactive rather than predictive. It also used precinct-wide data, while software such as PredPol can target enforcement to specific blocks. Crime does not occur randomly across cities; it tends to cluster. In Seattle, for instance, police found that half of the city’s crime over a 14-year period occurred on less than 5% of the city’s streets. The red squares in Foothill cluster around streets near junctions to main roads—the better to burgle and run while homeowners are at work—as well as around businesses with car parks (lots of inventory, empty at night) and railway stations. Burglars who hit one house on a quiet street often return the next day to hit another, hence the red squares.

And, unlike CompStat, which used arrests as a measure of officers’ productivity, PredPol aims to prevent rather than punish crimes. “I’m more concerned about the absence of crime” than citations and arrests, says Mr Malinowski. “We don’t want mass incarceration for little crimes.” As for measuring productivity, that, too, has grown easier. LAPD patrol cars are geotagged, and the red boxes geofenced, so senior officers know precisely how long each car spends there.

Exactly what data get fed into the algorithms varies by company. Some use “risk-terrain modelling” (RTM), which tries to quantify what makes some areas crime-prone. One RTM algorithm uses five factors: prevalence of past burglaries, the residence of people arrested for past property crimes, proximity to main roads, geographic concentration of young men, and the location of apartment buildings and hotels. Some include requests for police help, weather patterns and the proximity of bars or transport stations. PredPol uses reported, serious crimes such as murder, aggravated assault and various forms of theft, as well as the crime’s date, time and location. Most of these algorithms use machine learning, so they are designed to grow more accurate the more predictions they make and the more data they take in.
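As an illustration of the general shape of such models, and emphatically not any vendor’s actual algorithm, a risk-terrain score can be sketched as a weighted sum of factors per map cell. The weights below are invented; the factor names loosely mirror the five RTM inputs listed above:

```python
# Toy risk-terrain model: each map cell gets a weighted sum of local factors,
# each normalized to the range 0-1. Weights here are made up for illustration.
WEIGHTS = {
    "past_burglaries": 0.35,
    "offender_residences": 0.25,
    "near_main_road": 0.15,
    "young_male_density": 0.15,
    "apartments_hotels": 0.10,
}

def risk_score(cell: dict) -> float:
    """Combine a cell's factor values into a single score for ranking patrols."""
    return sum(weight * cell.get(name, 0.0) for name, weight in WEIGHTS.items())

cell = {"past_burglaries": 0.8, "near_main_road": 1.0, "apartments_hotels": 0.5}
print(risk_score(cell))  # 0.48; the highest-scoring cells would be flagged
```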

Some analytic programmes suck in and link up more data. A joint venture between Microsoft and the NYPD called Domain Awareness System pulls data from the city’s thousands of publicly owned CCTV cameras, hundreds of fixed and car-mounted automatic number-plate readers (ANPRs), and other data sources. The NYPD says its system can track where a car associated with a suspect has been for months past, and can immediately alert police to any criminal history linked with a flagged number plate.

You have the right to remain silent

So do these algorithms work? Do they accurately forecast where crime will occur and who will go on to commit future crimes? Here the evidence is ambiguous. PredPol touts its 21-month-long trials in Kent, an English county, and Los Angeles, which found that the programme predicted and helped to prevent some types of crime (such as burglary and car theft) more accurately than human analysts did. A trial in Louisiana of a different data-driven predictive-policing model, however, found no statistically significant reduction in property crimes compared with control districts.

But even if such approaches proved effective beyond a doubt, concerns over their potential to trample civil liberties and replicate racial bias would remain. These concerns are most acute for algorithms that implicate people rather than places. The Chicago police department has compiled a “strategic subject list” of people it deems likely to be perpetrators or victims of gun violence (both groups tend to comprise young African-Americans from the city’s south and west sides). Its central insight parallels that of geographic predictions: a small number of people are responsible for a large share of violent crime. The department touts its accuracy. In the first half of 2016, it says, 74% of gun-violence victims and 80% of those arrested for gun violence were on the list.

Police say they update the list frequently. When someone new shows up on it, officers will sometimes visit that person’s home, thus promoting contact with police before a person has committed a crime. Nobody knows precisely how you end up on the list, nor is it clear how (short of being shot dead) you can get off it. One 22-year-old man, Robert McDaniel, told the Chicago Tribune that police came to his home and told him to straighten up—even though he had just a single misdemeanour conviction (he may have been earmarked because a childhood friend with whom he was once arrested was shot dead).

In a study of the first version of the list from 2013, RAND, a think-tank, found that people on it were no more likely to be victims of a shooting than those in a random control group. Police say the current list is far more accurate, but have still refused to reveal the algorithmic components behind it. And both Chicago’s murder rate and its total number of homicides are higher today than they were when police started using the list in 2013.

Meanwhile, algorithms used in sentencing have faced criticism for racial bias. ProPublica, an investigative-journalism NGO, studied risk scores assigned to 7,000 people over two years in Broward County, Florida, and found black defendants twice as likely as whites to be falsely labelled at high risk of committing future crimes. It also found the questions predicted violence poorly: only around 20% of those forecast to commit violent crimes actually did so. Northpointe, the firm behind the algorithm, disputed ProPublica’s findings.

But the questions on Northpointe’s risk-assessment form illustrate how racial bias can infect an algorithm even without any direct questions about race. It asked how often a defendant, his family members and friends have been arrested. Those numbers will presumably be higher in poor, overpoliced, non-white districts than rich ones. It also asked whether friends were in gangs, how often the defendant has “barely enough money to get by” and whether it is “easy to get drugs in your neighbourhood”—all questions that ethnic minority defendants will, on average, answer affirmatively more often than white ones. More broadly, a proprietary algorithm that recommends a judge punish two people differently based on what they might do offends a traditional sense of justice, which demands that punishment fit the crime not the potential crime.

Another analytical system, called Beware, assigns “threat scores” in real time to addresses as police respond to calls. It uses commercial and publicly available data, and it has a feature called Beware Nearby, which generates information about potential threats to police near a specific address, meaning officers can assess the risk when a neighbour calls the emergency services.

This raises privacy concerns, but it could cause other problems, too. For instance a veteran who has visited a doctor and taken medicine prescribed for PTSD, who also receives gun catalogues in the post, could be deemed high risk. Police might then approach his house with guns drawn, and it is not hard to imagine that kind of encounter ending badly. Such threat scores also risk infection with bad data. If they use social-media postings, they also raise free-expression concerns. Will police treat people differently because of their political opinions?

Questions of bias also surround place-based policing. Using arrests or drug convictions will almost certainly produce racially biased results. Arrests reflect police presence more than crime. Using drug convictions is suspect, too. Black and white Americans use marijuana at roughly similar rates, with the rate for 18- to 25-year-olds higher for whites than blacks. But blacks are arrested for marijuana possession at nearly three times the rate of whites across America—and even more often than that in some districts. Black people in Washington, DC, and Iowa are eight times likelier than whites to face arrest for marijuana. Charges for possession of that one drug comprise half of all drug arrests. Small wonder that a study by Kristian Lum of the Human Rights Data Analysis Group and William Isaac found that when a predictive algorithm was trained on historical drug-crime data in Oakland, California, it targeted black areas at twice the rate of white ones, and low-income neighbourhoods at twice the rate of high-income ones.

Place-based prediction also raises questions about reasonable suspicion. If police are on a residential block algorithmically predicted to be at risk of theft, and they drive past a man carrying a heavy satchel, does that justify stopping and searching him, especially when they might not do the same on another block?

Some accept that algorithms may replicate racial biases, but say they at least do not aggravate them. “It’s not a perfect world,” says one advocate of algorithm-based bail reform. You need to compare risk-based assessments with the status quo, he says. If a black and a white defendant came before a judge with the exact same record today, the judge might treat the black defendant worse. “At least with the risk assessment they’ll get the same score.” But that is a depressingly low bar to set.

Value-Based Care’s Landscape Tilt – Healthcare Informatics

When it comes to aligning physician incentives and processes in order to master value-based contracting in healthcare, senior leaders at the 30-plus-hospital, Arlington-based Texas Health Resources (THR) have been working assiduously to put everything together, confirms R. Todd Richwine, D.O., chief medical informatics officer at the Texas Health Physicians Group, the umbrella physician group attached to the THR integrated health system.

Referring to the Next Generation accountable care organization (ACO) program sponsored by the federal Centers for Medicare and Medicaid Services (CMS), Dr. Richwine notes that, “As part of the Next Gen ACO, we have 85,000” Medicare beneficiaries being managed under that program. “And we’re in multiple other ACOs,” he adds, referring to the 29,000 Medicare beneficiaries covered under the North Texas Specialty Physicians Medicare Advantage contract the system services, the 21,000 Medicaid beneficiaries covered under the AmeriGroup ACO, and, in the commercial realm, the 112,000 United HealthCare, 69,000 Aetna, 48,000 Cigna, and 19,000 Humana health plan members that THR clinicians are caring for, for a total of more than 354,000 patients altogether.

As for Texas Health Resources’ journey into value-based care delivery and payment, “I would say it’s been slow and steady,” Dr. Richwine says. “We started into ACO work years ago with a group called Medical Edge, which became Texas Health Physicians Group. So we have a long history of working towards quality and being physician-led. We have a large number of very active physicians—anything around reimbursement or quality measures is led by physicians,” he adds.

Asked about the biggest challenges, Richwine offers, “I think the biggest challenge has been developing the support staff necessary to do well on the measures. The physicians are very motivated to do the right things for their patients. Unfortunately, some of what’s required falls outside of what physicians can accomplish themselves. It really requires an entire multidisciplinary team to achieve results, on behalf of sometimes-difficult populations. We make sure that we’re providing the care—everything from physicians to PAs and nurse practitioners, to social workers and case managers.”

Richwine and his colleagues at Texas Health Resources are far from alone in facing down the complexities of value-based healthcare. As early on in the collective journey as the U.S. healthcare system is, the leaders of organizations like THR are learning valuable lessons in how to prepare broadly to take on financial risk in contracts with the public and private purchasers and payers of healthcare, and how to engage and align physicians.

Early On in the Journey of a Thousand Miles

Certainly, the grand adventure in value-based healthcare is in its early stages. Asked to characterize how far along on the proverbial journey of a thousand miles U.S. providers are in that journey, Joe Damore, vice president at the Charlotte-based Premier Inc., says, “I think we are still in the childhood years, about to enter the teenage years, maybe, on this. We’ve got about 20 percent of both commercial and Medicare arrangements, that are two-sided—it’s actually 17 percent in Medicare that are on two-sided risk now, up from 13 percent,” he adds. “We’re moving more and more towards two-sided risk, but it’s a slow process. The delivery systems want to make sure they’re ready, and that they have the tools, knowledge, and information to manage two-sided risk. But one-sided risk has continued to grow, with over 500 Medicare ACOs across the country, and about the same number of commercial ACOs.”

What’s more, Damore says, “Our forecast is that two-sided risk will pick up even more soon, as providers need some downside risk to really focus on this. And we’ll also see a growth in Medicare Advantage and Medicaid managed care, involving some downside risk to providers. Many health plans want to do this and shift risk to providers, too, but many don’t have the infrastructure in place to move towards a capitated arrangement for primary care plus shared savings for specialty care; very few have built the tools to do that. There’s a desire to do that. Blue Cross of Hawaii has implemented that model, where they’re capitating over 500 physicians in Hawaii for primary care, and providing shared savings for total cost of care.”

“To add to what Joe has said, in the past many years, I’ve been focusing on the bundled payment side, and one of the interesting things with bundled payment is that if you go back to the BPCI (Bundled Payments for Care Improvement) program under Medicare, announced in 2011, the thousand-plus participants have been dealing with two-sided risk across the life of that program,” says Mark Hiller, vice president of bundled payment services at Premier. What’s more, Hiller notes, “There’s two-sided risk in the total joint replacement program. We’re working with providers on CHF, pneumonia, COPD, and so forth, in addition to total joint, and with two-sided risk. So that’s growing, if more slowly than the broad population health-related programs. But in the bundle area, it’s growing fast. I wouldn’t be surprised if the new program took off quickly,” he says, referencing what’s called BPCI-Advanced—the Bundled Payments for Care Improvement Advanced program, sponsored by CMS.

A fundamental challenge, says their colleague, Shawn Griffin, M.D., Premier Inc.’s vice president, clinical performance improvement and applied analytics, is that, collectively, “We’re still standing on opposite sides of the gym, trying to figure out if anyone wants to dance, and nobody’s a good dancer yet.” “Living in both worlds—the world of fee-for-service incentives and of value-based payment—is very difficult,” Damore agrees. “We have an expression” at Premier, Damore says: “fee-for-service is the enemy of value-based payment. That is the number-one challenge, that the model of payment to providers has to be aligned on a value-based model.” With reference to discussions about capitated payment in primary care, he says, “If my incentive is to get patients into the office, that’s what I’m going to do; but if my incentive is to manage my patients on a capitated model, I’m going to grow my panel and have advanced practitioners see my patients more.”

“I think the provider market is moving forward at a measured pace,” says Laurie Sprung, Ph.D., vice president, consulting, at The Advisory Board Company, the consultative firm based in Washington, D.C. “But,” Sprung says, referencing the recent announcements around potentially disruptive market changes on the horizon—from the proposed CVS takeover of Aetna, to the announcement earlier this year that Amazon, Berkshire Hathaway, and JP Morgan Chase are planning to create an employer-based disruption in the healthcare delivery market—“we all recognize that the opportunity for disruption is there. And part of what I see in all those areas is a wide-open space that many of our progressive members are in, where they’re on a value path, but they’re also on a consumer-focused path. What good is a clinically integrated network, if I can’t access it easily?” she asks, referring to the acceleration taking place in the development of clinically integrated networks in many local and regional healthcare markets across the U.S. Simply building new organizations of organizations won’t move providers forward fast enough, she adds. “You have to be willing to disrupt how you do things. It’s incredibly disruptive for a physician practice to do some visits live and some by telehealth; it’s hard operationally. But we’ve got to think through what the package is, ours or somebody else’s, that gets the consumers what they want when they want it, and how they want it. There are already a lot of low-cost alternatives to seeing a physician,” she notes.

The View from the Health Plan Side

All those interviewed for this article agree that it’s very important for the leaders of hospitals, medical groups, and integrated health systems to understand where the leaders of commercial health plans are moving, with the leaders of more innovative health plans moving ahead as quickly as possible.

One health plan whose senior executives would like the market to move forward faster than it is, is the San Francisco-based Blue Shield of California, which serves more than 4 million members in California. There, notes Chris Jaeger, M.D., Blue Shield’s vice president of accountable care innovation and clinical transformation, the organization’s entire provider-facing strategy is based on the presumption of value-based healthcare as a core building block. “We start with a base commercial contract with the providers, and then we lay on top of that a multi-party accountable care/HMO agreement,” in every instance possible, Jaeger says. “That involves Blue Shield, a provider group, and two facilities. And we set a global budget for the population attributed to that contract, and if they perform well and there are savings, we all share the savings, and if we come in over the budget, we share in the costs as well. The global budget helps us keep the premium rise at a better level than the non-HMO market,” he adds.

What’s more, Jaeger notes, Blue Shield of California’s accountable care contracts also include “quality parameters around which providers can see extra compensation as well. Where I see us going is in extending that kind of agreement to other stakeholders, including skilled nursing facilities and other stakeholders important to the continuum of care,” he says. “Beyond that, we’re working on alternate payment models separate from ACO arrangements. Those include bundled payments, where we create the bundles for certain conditions, procedures, or diseases, with clear guiderails or boundaries. For example, if we’re looking at a bundled payment for an elective hip replacement, those guiderails help us innovate with providers, around things they’re working on, so it makes sense for them and for us. So for instance, on that hip replacement, would there be an opportunity to collaborate with the physical therapists, rehab facilities, and even startups in the space, to use more technology in the home, versus building more bricks and mortar facilities.”

Jaeger emphasizes that the opportunity to collaborate to improve patient outcomes is one that he believes provider leaders should find appealing, in that collaboration to improve outcomes is a banner under which everyone can march together. Still, he acknowledges, the leaders of many patient care organizations are finding the path forward challenging. Asked whether the provider organizations partnering with Blue Shield of California have been achieving improved clinical and financial outcomes, Jaeger says, “Of course, it varies by provider. What matters first and foremost is their willingness and their skill; those are critical success factors. Another relates to how many patients are involved, both at the group level and the individual-physician level. The more at-risk members, the greater the likelihood to improve outcomes. And another factor relates to how long they’ve been working with us. Those groups that have been doing this for a longer time also have the benefit of Blue Shield investing in efforts around complex care management, certain types of clinics, etc.”

From the Physician Side: Slow, Steady Acceptance and Progress Seen

Meanwhile, leaders of multidisciplinary physician groups are working hard to make progress, even as they face obstacles and challenges. Still, confusion remains among leaders and clinicians at many patient care organizations about how various alternative payment models work, even bundled payments. Premier’s Hiller says, “There’s still a misunderstanding out there about how these models work. When I meet to talk with providers about bundles, especially when I talk to providers new to the concept, there’s a vast difference between how the program actually works and their understanding of it. There’s this notion that you’ll get a single payment and figure out how to distribute that, and many providers aren’t comfortable with that, because they think they’ll now have to act as a payer. But the vast majority of bundles today still involve a fee-for-service payment upfront, and it can actually be successful for them financially. To be successful with these bundles really requires managing quality of care: you don’t want multiple readmissions under a bundle. When you can reduce readmissions, and look across the continuum of care, which most providers aren’t used to doing, you can be quite successful, both financially and from a quality perspective,” he emphasizes.

In the end, a classic differentiator remains operational here, says Sohail, the CIO of the Dallas-based Premier Management Company, a firm that organizes and manages ACOs (and is unrelated to the Charlotte-based Premier Inc.). And that differentiator, says Sohail (he uses one name only), is the element of personal leadership. “Looking nationwide,” Sohail says, “it is clear to me that, among those organizations that have had tremendous success in value-based healthcare, personal leadership is, in my humble opinion, the primary trait. When leadership understands why it’s important to move into value-based care, that’s when things move faster. And there is a learning curve involved. It’s better to fail earlier. So in my opinion, personal leadership is core. Secondly,” he says, “with the bureaucracy of large organizations, any set of changes has to go through so many iterations—and that can only be addressed through strong leadership.”

Blockchain technology projects will be shelved from 2018: report – ETCIO.com

Bangalore: The hype surrounding blockchain technology will recede sharply in 2018 as the cost and complexity of implementing blockchain solutions become apparent, according to GlobalData’s Thematic Research report.

Many of the early blockchain projects will either be quietly shelved in favour of more traditional approaches or they will evolve in a way which reduces their dependence on blockchain technology.

GlobalData’s Thematic Research report, ‘Blockchain – Thematic Research’, reveals that while the market is awash with absurd claims about the benefits of blockchain technology, there are some key domains where the ability to execute distributed transactions without relying on a single central authority will bring significant value. While blockchain technology will have lost much of its gloss by 2025, it will have found its way into the heart of many key business processes, especially those involving multiple, disparate participants.

Blockchain is an electronic ledger of transactions that is continuously maintained in blocks of records. The ledgers are jointly held and run by all participants. Coupled with cryptographic security, this makes them tamper-proof (at least in theory).
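To make that definition concrete, here is a minimal, illustrative Python sketch (our own, not drawn from the GlobalData report) of the mechanism that makes such a ledger tamper-evident: each block commits to the hash of the block before it, so rewriting history breaks every subsequent link.

```python
# Minimal hash-chained ledger sketch (illustrative only).
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def is_valid(chain: list) -> bool:
    """Re-derive each link; tampering anywhere shows up as a mismatch."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
add_block(chain, ["Alice pays Bob 5"])
add_block(chain, ["Bob pays Carol 2"])
print(is_valid(chain))                              # True
chain[0]["transactions"][0] = "Alice pays Bob 500"  # rewrite history
print(is_valid(chain))                              # False
```

Real blockchains add consensus mechanisms on top of this chaining, so that many independent participants can agree on which chain is authoritative.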

The Most Powerful Women in Healthcare IT for 2018—CIOs and IT execs – Health Data Management

Title: CIO

Organization: HCA, West Florida Division

Years in HIT: 29

Previous Positions: IT Division Director, HCA; IT Director, HCA Regional Medical Center, Hudson, Fla.; Director of IT, Lenox Hill Hospital; Consultant, E.C. Murphy, Yuhasz Consulting.

Significant Achievements: Coming in as the IT director of one HCA facility in 1996, she now has executive IT responsibilities for 16 hospitals in HCA’s West Florida division.

Impact on HIT: McFarland has led the organizations in her division to implement electronic health records, while managing a wide range of other tasks for HCA’s West Florida Division. She also is involved in a number of professional associations, particularly CHIME and HIMSS.

The Pentagon has a ‘major’ automated information systems problem – C4ISRNet

The report found that the policies dictating the Defense Department’s 10 MAIS business programs do not meet industry standards for providing performance data. These standards ensure that projects are keeping to their initial cost estimates, schedules, and performance goals. An IT program is designated as a MAIS when its single-year costs exceed $40 million, its total acquisition costs exceed $165 million, or its total life-cycle costs exceed $520 million.
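Those thresholds amount to a simple “any one trips it” rule. The following sketch (ours, purely illustrative, with figures in millions of dollars taken from the article) restates the designation test:

```python
# Illustrative restatement of the MAIS designation thresholds
# (figures in millions of dollars, from the article).
def is_mais(single_year_cost: float,
            total_acquisition_cost: float,
            total_lifecycle_cost: float) -> bool:
    """A program is designated MAIS if any one threshold is exceeded."""
    return (single_year_cost > 40
            or total_acquisition_cost > 165
            or total_lifecycle_cost > 520)

print(is_mais(35, 170, 400))  # True: acquisition cost crosses $165M
```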

Technology that changed us: The 1990s, from WorldWideWeb to Google – ZDNet

WorldWideWeb showing many of its functions.

(Image: Tim Berners-Lee for CERN/Public domain)

In this 50-year retrospective, we’re looking at technologies that had an impact on the world, paved the way for the future, and changed us, in ways good and bad.

Previously, we explored the 1980s. Now we continue our time travels in the 1990s.

1990: WorldWideWeb, the first Web browser

Of all the technologies that changed our lives, perhaps the most profound of the last 50 years has been the web. But it wasn’t the ability to hyperlink documents that made the most impact. Instead, it was the application that presented all that information to users, the browser.

Read also: No internet: The unbearable anxiety of losing your connection

The browser, in combination with the various web protocols, allowed access to the web from a wide variety of operating systems and devices. It allowed untrained users to click and browse from website to website. But even before there were public websites, there needed to be a browser.

That browser was initially called WorldWideWeb. Its name was later changed to Nexus to avoid confusion with the entity we now call the web, which back then was the World Wide Web or WWW. The web changed the world, but it was the browser that delivered those changes worldwide.

Runner up: Windows 3.0.


Floppy disks holding a very early version of Linux.

(Image: Shermozle/Creative Commons Attribution-Share Alike 3.0 Unported license)

1991: Linux

We’re now into the 1990s and technology change is accelerating. The first website went online at CERN. In fact, so much happened that we have a few articles devoted to 1991 alone. But of all the innovations, of all the products launched, one stands out: Linux.

CNET: Nintendo Switch turned into Linux tablet by hackers

But it was the message sent out on August 25, 1991 to the Minix Usenet newsgroup that changed everything. Linus Torvalds typed, “I’m doing a (free) operating system (just a hobby, won’t be big and professional…” Ah, Linus. You got so much right, but you got the scale of Linux’s eventual impact so very wrong.

Linux took UNIX and blasted it out of existence. Instead of a very expensive-to-license operating system, Linux was free. It fired up open source. And today, Linux runs in everything, from light bulbs to cars, to almost all TVs and phones on the market.

Runners up: A lot, plus the first website.

1992: The first SMS text message

Who would have thought that people would prefer typing over talking on their phones? While the SMS concept had existed for quite some time, it wasn’t until December 3, 1992 that engineer Neil Papworth sent a message to Richard Jarvis’ Vodafone Orbitel 901 handset. The message that precipitated billions of very sore thumbs was a simple “MERRY CHRISTMAS”.

Read also: Facebook was tracking your text message and phone call data

At the top of its usage curve, US cell phone customers sent 2.3 trillion SMS messages in a year. But as this chart from Statista shows, SMS volume has been declining steadily as users migrate to app-based messaging from Apple, WhatsApp, and Facebook. Even so, SMS changed how we talk, or rather don’t talk, to each other.

Runners up: Windows 3.1, first ThinkPad.


Mosaic was the first widely-popular graphical web browser.

Image: National Center for Supercomputing Applications (NCSA) and the Board of Trustees of the University of Illinois

1993: Mosaic web browser

By 1993, things were heating up for the World Wide Web, which was quickly becoming actually worldwide. While Mosaic wasn’t the first browser, it was the first that could display images. For the time, it was very fast, and it quickly became popular.

Mosaic was created by Marc Andreessen and Eric Bina, students working at the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign. Mosaic eventually became Netscape, which dominated the web (for a while, at least).

Runners up: Windows NT, Myst, DOOM, plus the first webcam to improve caffeine intake efficiency (technically, the Internet of Things was born here, as well, and, as it should be, it was all because of coffee).

1994: Amazon founded

At the time of its founding back in 1994, no one could have known that Amazon would become one of the world’s most innovative companies. Then, it was a source for books.

Read also: Amazon’s Jeff Bezos was, for a day, the world’s richest man

Today, it’s at the core of the cloud movement, has played a primary role in killing off retail (or at least beating retailers who weren’t on their best game), has revolutionized digital books, transformed product availability and delivery, created an AI that lives in our homes, and has become a prime producer of top-tier original video content.

Runners up: Sony’s first PlayStation, PHP, and, sadly, banner ads.

1995: Windows 95 and IE 1.0

By 1995, Windows had been around for a full decade. But it was in 1995 that what became the dominant desktop environment for the next two decades would be introduced. While a new Windows 10 user or Mac OS user might not know how to use Windows 3.1 on sight, every modern desktop computing user would know how to use Windows 95.

Windows 95 was the first version of Windows to include IE, which would become the dominant browser for more than a decade. While network configuration was still cumbersome, Windows 95 finally gave Microsoft the foundation for what would become the modern desktop experience.

Runners up: JavaScript and SSL.


An early model — the PalmPilot Personal.

(Image: Channel R/Creative Commons Attribution-Share Alike 3.0 Unported license)

1996: Pilot handheld (first Palm handheld)

At the time, it was hard to believe a modem company would introduce the first successful handheld PDA. Now, of course, with handheld smartphones dominant, it’s impossible to separate communications from personal devices.

Read also: Today’s cheapest iPhone vs the PalmPilot

1996 also gave birth to the USB and CSS. These have had their impact on technology, but it was the small, portable, relatively inexpensive Pilot handheld that replaced personal organizers and was the first device, since the watch, that came with us everywhere.

Runners up: USB, CSS, and IPv6.

1997: Steve Jobs returns to Apple

A lot went on in 1997, but the single biggest event, arguably the one that changed all of technology, was the return of Steve Jobs to Apple.

CNET: Steve Jobs’ 1973 job application up for auction

You have to remember that in 1997, Apple was dying. It was always described as “the beleaguered Apple Computer” or “the troubled Apple Computer.” No one would have expected Apple to utterly transform music and telephones, not to mention lead the digital mobile transformation we’re experiencing now.

One more thing: It could be argued that other companies would have created mobile devices, but it was the force of Jobs’ personality and his steadfastness of purpose that overcame the seemingly impenetrable blockades and old style of business practiced by mobile operators. Sure, we would have had smartphones. But smartphones would not be what they are, the dominant technology worldwide.

Runners up: Netflix founded and Wi-Fi 802.11 standard adopted.

1998: Google founded

If you’re not sure about the impact of Google on modern times, Google it. For the early years of the web, search engine wars dominated the news. Then came the Google algorithm, famous for surfacing much more relevant information.

Read also: Google erases ‘Don’t be evil’ from code of conduct after 18 years

Somehow, a page that was simple and barebones eclipsed all other advertising, determined what was relevant to… everything, and became the dominant information verb in our lives. Founded with the motto “Don’t be evil,” it’s not at all clear whether Google will be our constant assistant and friend, or our ultimate undoing.

Runners up: Windows 98 and first iMac introduced.


Original (Graphite) AirPort Base Station.

(Image: Jared C. Benedict/Creative Commons Attribution-Share Alike 2.0 Generic license)

1999: Apple AirPort (and iBook)

Apple has a habit of taking existing technologies and molding them into something irresistible to consumers. Along the way, Apple has often set the pace, effectively giving other companies “permission” to enter similar markets.

CNET: Apple will stop selling its AirPort routers when supplies run out

While neither the 1999 AirPort Wi-Fi access point nor the easy-to-mock clamshell design of the Apple iBook were barnburners, they showcased one feature that has changed computing. Before the AirPort (and Wi-Fi), computers were always tethered. If you wanted to access a network, you had to plug in. But with the advent of Wi-Fi, we could take our machines anywhere in the home or office, without wires.

The AirPort showed it was possible, and the entire world followed.

Runners up: BlackBerry and preparing for the Y2K bug.



Why Today’s Business Schools Teach Yesterday’s Expertise – Forbes Now

As the world undergoes a Fourth Industrial Revolution that is “fundamentally altering the way we live, work, and relate to one another—in its scale, scope, and complexity, a transformation … unlike anything humankind has experienced before”—one might imagine that business schools would be hotbeds of innovation and rethinking, with every professor keen to help understand and master this emerging new world.

Paradoxically, it’s the opposite. For the most part, today’s business schools are busy teaching and researching 20th century management principles and, in effect, leading the parade towards yesterday.

The Baker Library at the Harvard Business School (AP Photo/Charles Krupa)

Take for instance this first-hand account just published in The New Republic by John Benjamin, an MBA student and Dean’s Fellow at the M.I.T. Sloan School of Management.

MBA programs are not the open forums advertised in admissions brochures… Business school instruction is routinely blinkered… An MBA class will consider a business issue… in isolation. Its challenges are delineated; its society-level implications are waved away. The principals’ overriding goal—profit maximization—is assumed. With mechanical efficiency, students then answer the question of how to move forward. Individual choices are abstracted into numbers or modeled as graphs…

As Sarah Murray has written in the Financial Times: “While there is growing consensus that focusing on short-term shareholder value is not only bad for society but also leads to poor business results, much MBA teaching remains shaped by the shareholder primacy model.”

The Four Industrial Revolutions

“The First Industrial Revolution used water and steam power to mechanize production. The Second used electric power to create mass production. The Third used electronics and information technology to automate production. Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century. It is characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres.”

-Klaus Schwab, Founder and Executive Chairman, World Economic Forum Geneva

A Vast Societal Drama

In the Fourth Industrial Revolution, a vast drama is now playing out in society, affecting almost everyone. In its essence, the Revolution is very simple: organizations are connecting everyone and everything, everywhere, all the time. They are becoming capable of delivering instant, intimate, frictionless, incremental value on a large scale. They are creating a world in which people, insights, and money interact quickly, easily and cheaply.

What is enabling the Revolution is that rarest of things, a genuine paradigm shift in management. Firms leading the Revolution are being run very differently from the unwieldy industrial behemoths of the 20th century. They are focused on continuous innovation for customers and organized to be nimble, adaptable, and able to adjust on the fly to meet the shifting whims of a marketplace driven by end-users. Think Amazon, Apple, Facebook, Google, Microsoft, Alibaba, Airbnb, Etsy, Lyft, Menlo Innovations, Saab, Samsung, Spotify, Tencent, Tesla, Uber and Warby Parker. In these firms, profits are the result, not the goal, of the enterprise. For them, the future is thrilling and uplifting.

For those companies that continue to be run like the lumbering 20th century mastodons, based on profit maximization and a philosophy of controlism, the situation is very different. The examples here are also abundant. “Market-leading companies,” as analyst Alan Murray has written in the Wall Street Journal, “have missed game-changing transformations in industry after industry—computers (mainframes to PCs), telephony (landline to mobile), photography (film to digital), stock markets (floor to online)—not because of ‘bad’ management, but because they followed the dictates of ‘good’ management.”

In effect, the “good” management of yesterday that these firms are practicing—profit maximization and a philosophy of controlism—is obsolete. It was a relatively good fit for much of the 20th century. But then the world changed, and “good” management began to falter. It couldn’t cope with the fast pace and complexity of a customer-driven marketplace. Yet this “good management of yesterday” is, by and large, what is being taught in today’s business schools.

Emmanuel Macron, at a conference at the Ecole Superieure des Affaires (AP Photo/Bilal Hussein)

Let’s be clear: the difference between leaders and losers isn’t a matter of access to technology or big data or artificial intelligence. Both the successful and the unsuccessful firms have access to the same technology, data and AI, which are now largely commodities. Traditionally-managed firms use the same technology and data but typically get meager results. It’s not technology or data or AI that make the difference. The difference lies in the nimbler way these firms deploy technology, data and AI.

Despite individual thought-leaders in business schools, there has been little change in the core curricula of business school teaching as a whole. The disconnect between what is taught and the vast ongoing societal drama under way continues. And it’s difficult to discuss, because it puts in question careers, competencies, job tenure, values, goals, assumptions of the entire business-school world and more.

Individual Innovators

It’s not that individual business school professors haven’t grasped and made the case for change. Among many others:

  • In 2010, business guru and former business school dean Roger Martin at the Rotman School of Business hailed the advent of customer capitalism over shareholder primacy.
  • More recently, two distinguished Harvard Business School professors—Joseph L. Bower and Lynn S. Paine—declared in Harvard Business Review that profit maximization is “the error at the heart of corporate leadership.” It is “flawed in its assumptions, confused as a matter of law, and damaging in practice” and, in effect, “pernicious nonsense.” Yet business schools press ahead with core curricula based on this error, seemingly impervious to these issues.
  • In 2013, professor Rita McGrath at Columbia Business School challenged the conventional business school theology of sustainable competitive advantage.
  • In 2014, professor William Lazonick at the University of Massachusetts Lowell won the HBR McKinsey Award for the best HBR article of 2014. Lazonick has courageously led the charge in identifying the problems inherent in maximizing shareholder value through massive share buybacks.
  • In 2014, professor Clayton Christensen and his colleague Derek van Bever questioned the orthodoxies governing finance and called for a modern-day Martin Luther to articulate the change.
  • Business schools also support many technology centers and small programs, such as Jay Goldstein’s program at Northwestern’s McCormick School of Engineering on radical management and Susan George’s program on Agile Management at UC Berkeley Extension.

Yet these innovations have not led to basic change to the core curricula in most business schools.

Agile Management Is Now Mainstream

Until recently, the lack of familiarity in business schools with the new way of running organizations could in part be excused because the management expertise itself was still somewhat obscure. There was little awareness of it outside software development, and general management thinkers had little respect for management ideas coming from software developers.

That’s no longer the case. The paradigm shift in management is no longer a secret: for instance, right now it’s on the cover of the May-June 2018 issue of Harvard Business Review. It’s the subject of scores of articles in Strategy & Leadership under the bold editorship of Robert Randall. There are also many books written about it, including a recent book co-authored by the managing partner of McKinsey, Talent Wins.

But it is very different. The new management isn’t simply a new training course, or a process, or a methodology or an organizational structure that can be written down in an organizational manual and simply added to the ongoing agenda. It’s a different mindset with counterintuitive ideas that fly in the face of the assumptions of a “good” 20th century manager or the typical business school case.

  • Managers can’t tell people what to do;
  • Control is enhanced by letting go of control;
  • Talent drives strategy;
  • Dealing with big issues requires small teams, small tasks, small everything;
  • Complex systems are inherently problematic, and must be descaled;
  • Companies make more money by not focusing on money.

Internalizing these counterintuitive ideas and making them part of the culture of an organization—including a business school—doesn’t happen easily or quickly, particularly with people steeped for decades in 20th century thinking.

Interestingly, business itself is beginning to grasp that something is up. Surveys by both Deloitte and McKinsey indicate that over 90% of senior executives want to master new, more flexible business practices, even if less than 10% see their current organization as highly agile.

Meanwhile, business schools keep churning out more standard MBAs, steeped in yesterday’s methodologies—truly excellent executives for the 20th century. Today’s business schools thus resemble medical schools teaching pre-penicillin medicine.

A Treadmill Of Irrelevance

There are a number of elements that keep business schools stuck on an obsolete institutional treadmill:

  • There is still a large demand for 20th century management. Many managers in large firms have spent decades implementing the doctrines of profit maximization and controlism. They share the same assumptions, goals, values and attitudes. And so their firms continue to be run that way, despite the performance problems it causes. So it is not surprising that these executives seek recruits who have the same mindset. Hence there is still continuing demand for MBA graduates schooled in 20th century thinking.
  • It isn’t obvious that prior management experience plays a large role in becoming a business school professor. The professors usually have no experience managing even 20th century firms, let alone firms implementing the new paradigm.
  • It appears that careers in a business school depend more on research than on teaching.
  • The accreditation process of business schools guarantees glacial change to core curricula. It takes around five years to have even a small change to the core curriculum accepted by the accreditation process. (One highly-successful dean admits with frustration that 15 years of strenuous effort resulted in only two minor changes in the core curriculum.)
  • The widespread use of $200 textbooks suggests financial interests of faculty also prevent change.
  • In some cases, business schools also perform a cash-cow function for the rest of the university, as the business school attracts wealthy students from overseas. The risk of losing this revenue stream induces caution in changing a money-making degree.
  • In the current setup, what is taught by a business school hardly matters. That’s because business schools mainly perform a “filtering” function (“selection of the sharpest analytic minds”) rather than a teaching function (“what is good management practice for a 21st century firm?”)
  • The business school has little need to concern itself with value to the end-users—the students. The inherent learning an MBA provides matters less than the high-salaried job offers it leads to. Indeed, the high cost of an MBA is a feature more than a bug. It’s part of the MBA’s mystique in the marketplace: “Anything that expensive must surely be valuable!”

Root Cause: Misuse Of the Scientific Method

How did such an unproductive system come about? Many of today’s business school problems date back to 1959, when the Ford Foundation published the Gordon-Howell Report, which lambasted the unscientific foundation of business education. In the same year, the Carnegie Foundation published an analysis with an equally harsh message.

The result? Half a century of business school research that aspired to be “scientific.” The problem is that in a field of human activity undergoing dramatic change, findings that can be proven universally true by double-blind scientific experiments turn out to be of little practical utility. Research therefore came to be evaluated on the elegance and rigor of the experimental design more than on the utility of the findings. The fact that few, if any, business people ever tried to read, let alone implement, the research was considered irrelevant. Business school research is an enclosed, self-referential world—academics writing for other academics. The utility of the entire research enterprise is not a fit subject for discussion.

What went wrong here? Roger Martin argues persuasively that the Ford Foundation made a fundamental conceptual error. It was seeking scientific findings in a subject where science methodology can’t be applied:

Aristotle was a proponent of his scientific logic, and in the best scientific tradition, he established boundary conditions for his theory. It was for the part of the world in which things could not be other than they are. An oak tree is an oak tree and cannot be something else. A piece of granite is a piece of granite and can’t be something else. For this world, Aristotle laid out the seminal scientific method and argued that it was the optimal way for understanding that part of the world.

As Aristotle laid out his scientific method for the part of the world that cannot be other than it is, he also cautioned that there is another part of the world that can be other than it is, and there was another method that needed to be used to understand it. Here the scientific method would be wholly inappropriate.

That part of the world consists of people – of relationships, of interactions, of exchanges. In this part of the world, relationships can be good, bad or indifferent; close, distant or sporadic. They change – they can be other than they currently are. For this part of the world, Aristotle said that the method used to develop our understanding and to shape this world is rhetoric; dialogue between parties that builds understanding that actually shapes and alters this part of the world…

[The Ford Foundation thus] embraced Aristotle’s science, but ignored his boundary condition. They pushed the scientific method past the limits for which it was designed. It was as if they adopted Aristotle’s tool and then ignored his user manual.

These Problems Are Old News

Not surprisingly, the irrelevance and obsolescence of business school thinking isn’t really news. Five years ago, Harvard Business School professor Rakesh Khurana offered a scathing critique of business schools in strategy + business.

Business schools are facing “a crisis of irrelevance.” They are “in an incredible race to the bottom.”

It’s “not even clear what an MBA consists of anymore.” There is a “lack of quality and consistency in the development of general management knowledge.”

The incentives are based on “research and academic credibility, not producing management knowledge.” As a result, “very little general management knowledge” has been produced by business schools.

As a result, “business schools have lost the place where we could be turned to as a source of basic research and basic knowledge. Very few businesses turn to us. They turn to other sources of knowledge, such as consulting firms, instead.” An MBA is now “a highly valued credential, but you’re not going to learn much along the way.”

Lesser-Known Business Schools Are In Decline

In one sense, business schools are one of the success stories in higher education over the last century. Worldwide, there are many thousands of public, private, and foundation institutions providing education in the field of business and management.

The future, however, is unpromising for the lesser-known schools. Business schools at the University of Wisconsin, the University of Iowa, Virginia Tech, and Simmons College have closed programs or announced plans to do so.

There are various factors in play. Non-MBA graduate business degrees are proliferating; fewer international students are opting for American schools; undergraduate student debt is ballooning; and online courses including massive open online courses (MOOCs) are offering cheaper alternatives to the learning aspect of MBAs, although without the prospect of a lucrative job offer.

Yet not all is gloom. While applications to business school overall have declined over the past three years, the top-ranked business schools are seeing an influx of interest. This has less to do with the knowledge being imparted than with the high-salaried job prospects that the celebrity business schools offer.

Business Should Be Taught By Business People

In 2013, Larry Zicklin, a former chairman of the Wall Street investment firm Neuberger Berman, a professor at New York University’s Stern School and a lecturer on ethics at the Wharton School at the University of Pennsylvania, made a revolutionary proposal to fix business schools. Business school teachers, he said, should teach business.

Academics at business schools now spend a lot of their time doing research for academic journals that is of little practical relevance and that nobody even reads. “Academics do it,” Zicklin told the Financial Times, “because they are interested in doing it, but it’s also the mechanism by which they get promoted and secure tenure. Because research is an important part of business school rankings, it has created the value system by which academics are rewarded. Research adds so much to the cost of education, especially at business school. But the evidence about research also suggests that most people can’t understand it.”

Although Zicklin’s recommendations have not been widely accepted, there are exceptions. Hult International Business School, for instance, aspires to be relevant to business with a curriculum stressing real-world experience and faculty who have management experience.

CEO of TechnoHome delivers the keynote address for Hult International Business School. (Josh Reynolds/AP Images for Hult International Business School)

Three Scenarios

What is required is a radical rethink of the business school system. The challenge is to build business schools that are as innovative as they are efficient, as forward-looking as they are pragmatic, as passion-filled as they are intellectually rigorous and as valuable as their enrolment fee. This is not merely about adding on a new course, practice, process or structure to the current modus operandi. Instead, business schools have to rethink their existence from top to bottom.

One can imagine three possible scenarios:

Bold change: It’s possible that a few exceptional pioneers will have visionary leadership that is ready to build the business school of the future from first principles. This would entail having business people teach business, abandoning the current accreditation system, setting aside tenure, recruiting a whole new set of professors with different skill-sets and embracing agile management from top to bottom.

Glacial change: For most schools, the most likely outcome will be continuing glacial change with reform being pursued within the current assumptions of business schools. A few minimal adjustments at the margin may occur every decade or so until the schools are disrupted and put out of business.

Agile organic change: Some schools may be willing to adopt a systematic approach to basic change, starting small and proceeding with organic change in an agile fashion:

  • They would need guidance and support from an enlightened leadership, aimed at establishing the case and the basis for change. They would need to focus primarily on delivering real value to students.
  • The work should begin with a very rapid inquiry into how it has come to pass that business schools are operating in such an obsolete fashion. How could such a thing have happened? The work should be willing to discuss the undiscussable, including the validity and utility of “scientific” research in business, career structures, the accreditation system, entrenched financial interests and more. The work should be aimed at trying to get a substantial body of opinion that agrees on the depth and breadth of the problem.
  • The work should proceed in an agile fashion, identifying experiments to deal with the challenge, even small steps.
  • The work should focus on opportunities, as well as problems. Surveys show that over 90% of senior executives want to master new, more flexible business practices, but less than 10% see their own organization as currently having mastered and implemented such practices. That 80-point gap between aspiration and performance is a huge opportunity for business schools. Consulting firms like Deloitte and McKinsey are themselves rapidly gearing up to meet the need. If business schools don’t see and grasp the opportunity, the risk of disruption is even greater. It would be a pity if business schools miss the boat.

And read also:

Explaining Agile

Why Agile Is Eating The World

The World’s Dumbest Idea: Maximizing Shareholder Value

Why aren’t business schools more business-like?

Getting down to Business in the Business Schools

What’s Wrong with Today’s Business Schools?

Note: This article draws on helpful discussions with Johan Roos, John Benjamin, Jay Goldstein and Peter Stevens.

Note: The original version of this article incorrectly included Wake Forest University in a list of business schools facing planned or actual closure. Wake Forest continues to operate, while changing the delivery of its MBA program, and moving away from a full-time daytime offering to focus on its evening and weekend programs.

The Birth Of GDPR: What Is It And What You Need To Know – Forbes Now


When you woke up this morning, you may have noticed that your email inbox has been flooded with emails from businesses and organizations informing you that they have “updated their privacy policy”.

The reason: today, GDPR goes into effect, and if a business isn’t compliant, hefty fines and penalties await.

What Is GDPR and Why Is It Necessary?

The General Data Protection Regulation (“GDPR”) is a legal framework that requires businesses to protect the personal data and privacy of European Union (EU) citizens for transactions that occur within EU member states. It covers all companies that deal with the data of EU citizens, including banks, insurance companies, and other financial companies.

The 1995 Data Protection Directive

In April 2016, the European Parliament adopted the GDPR, replacing its outdated Data Protection Directive, enacted back in 1995. Unlike a regulation, a directive allows each of the twenty-eight members of the EU to adopt and customize the law to the needs of its citizens, whereas a regulation requires full adoption, with no leeway, by all 28 countries. In this instance, the GDPR requires all 28 countries of the EU to comply.

The issue with the Directive is that it’s no longer relevant to today’s digital age. Its provisions fail to address how data is stored, collected, and transferred today. Like many regulations and statutes throughout the EU and U.S., it hasn’t been able to keep pace with technological advancement.

Exploring the GDPR

The full text of GDPR comprises 99 articles, setting out the rights of individuals and the obligations placed on businesses that are subject to the regulation. GDPR’s provisions also require that any personal data exported outside the EU be protected and regulated. In other words, if you touch any European citizen’s data, you had better be compliant with the GDPR. For example, if a U.S. airline sells services to someone in the UK, it is still required to comply with GDPR, even though the airline is located in the U.S., because European data is involved.

It is a very high standard to meet, requiring that companies invest large sums of money to ensure they are in compliance. According to the EU’s GDPR website, the legislation is designed to “harmonize” data privacy laws across Europe, providing greater protection and rights to individuals.

Since long before the Internet, Europe has been the model for how our data should be protected and regulated. The reason is that public concern over privacy has dominated the business sphere there, ensuring that stringent rules on how companies use citizens’ personal data are always taken into account.

Two days ago, the UK government enacted a new Data Protection Act, replacing the previous law passed back in 1998. Running 353 pages and full of complex provisions, it largely incorporates the provisions of GDPR, but differs in that individual countries were able to select parts of GDPR to customize to their citizens’ needs.

After months of learning about data breaches at companies like Facebook and Equifax, this couldn’t be more necessary. Even Mark Zuckerberg jumped on board in his testimony before Congress on Capitol Hill, calling GDPR a very positive step for the Internet.

What Data Is Protected Under GDPR?

With the enactment of GDPR today, two major protective rights should be highlighted. First, the right of erasure, or the right to be forgotten: if you don’t want your data out there, you have the right to request its removal or erasure. Second, the right of portability: individuals can obtain their personal data from a provider and reuse it elsewhere. And when it comes to “opt-in/opt-out” clauses, the notices to users must be very clear and precise as to their terms.

GDPR requires clear consent and justification. Pursuant to the GDPR, the following types of data are addressed and covered:

(1) Personally identifiable information, including names, addresses, dates of birth, and social security numbers

(2) Web-based data, including user location, IP address, cookies, and RFID tags

(3) Health (HIPAA) and genetic data

(4) Biometric data

(5) Racial and/or ethnic data

(6) Political opinions

(7) Sexual orientation

What Criteria Need To Be Met?

As mentioned earlier, the GDPR comprises a total of 99 articles; that’s a lot of reading. Any company that stores or processes personal information about EU citizens within EU states must comply with the GDPR, even if it has no business presence within the EU. Companies are subject to GDPR if any of the following applies (a rough sketch of this test follows the list):

(1) The business has a presence in an EU country;

(2) Even if there is no presence in the EU, the company still processes personal data of European residents;

(3) The company has more than 250 employees; or

(4) The company has fewer than 250 employees, but its data processing impacts the rights and freedoms of data subjects.
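To restate that checklist, the sketch below (our own illustration; the function and parameter names are invented, and actual applicability is a legal determination, not a boolean) encodes the four criteria:

```python
# Illustrative encoding of the article's four applicability criteria.
def gdpr_applies(has_eu_presence: bool,
                 processes_eu_personal_data: bool,
                 employee_count: int,
                 processing_impacts_rights: bool) -> bool:
    """True if any of the article's criteria is met."""
    return (has_eu_presence
            or processes_eu_personal_data
            or employee_count > 250
            or processing_impacts_rights)

# A U.S.-only airline selling tickets to UK residents:
print(gdpr_applies(False, True, 120, False))  # True
```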

How Do You Know If You Are Prepared?

Well, individuals and businesses have had almost two years to figure out how to ensure compliance, so there shouldn’t be any excuse for failing to comply. But let’s be realistic: a large number of companies are going to get hit, hard. Today marks the day on which all that effort is broadcast to the world of consumers.

#1 – Data Breach Incident Response Plan

The biggest sign of readiness is having a data breach plan or incident response plan in place. While most companies have some form of a plan in place, they will need to review, amend, and update it, ensuring full compliance with GDPR requirements.

This is only half the battle. You had better be prepared to enact it when a data breach occurs. Testing these plans is essential; otherwise, how will you know if the plan actually works? The GDPR requires that companies report breaches within 72 hours, or three days. How well the data response team is able to implement the plan and minimize any damage will affect how much a company is fined and/or penalized.
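As a small worked example of that 72-hour window (our own illustration, with an invented helper name), the deadline is just the discovery time plus 72 hours:

```python
# The 72-hour breach-notification window as arithmetic.
from datetime import datetime, timedelta

def notification_deadline(discovered_at: datetime) -> datetime:
    """GDPR deadline: 72 hours after the breach is discovered."""
    return discovered_at + timedelta(hours=72)

discovered = datetime(2018, 5, 25, 9, 30)
print(notification_deadline(discovered))  # 2018-05-28 09:30:00
```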

#2 – Hiring A Data Protection Officer (DPO)

The GDPR requires that a data protection officer (DPO) be appointed. However, it doesn’t specify whether this needs to be a discrete position, so presumably a company could name an officer who already has a similar role, so long as that person can demonstrably protect personally identifiable information (PII) and has no conflict of interest. GDPR also allows the DPO to work for multiple organizations, lending support to a “virtual DPO” as an option.

#3 – Create a Record or Log of Risks and Compliance Progress

Now that the clock has ticked its last tock, companies had better have an updated record of the progress made over the past two years, showing that they identified all their risks and the measures taken to minimize or eliminate those risks. This record, or Record of Processing Activities (“RoPA”), is required by Article 30 of GDPR, focusing on the inventory of risky applications and programs that may be operating.

However, another question presents itself: who keeps the log, and how is it maintained? Fears of manipulation, alteration, and fraud still need to be addressed. In the era of blockchain, storing the log on a blockchain, where it cannot be manipulated or altered, could prove extremely useful for companies moving forward.

How Does This Affect the US?


When it comes to US businesses, the GDPR requirements will force them to change the way they process, store, and protect customers’ personal data. Companies must provide a “reasonable” level of data protection and privacy to their customers, storing personal data only with those customers’ individual consent and for no longer than absolutely necessary for the purposes for which it is processed. However, the regulation doesn’t define what “reasonable” means, which could present complications when incidents occur and regulators must judge whether an organization took enough steps to minimize the damage.

Upon request, companies must erase personal data—a sharp contrast with the still-unfolding Cambridge Analytica and Facebook scandal. The right to be forgotten is a powerful right, and one we as citizens are all entitled to. However, GDPR doesn’t supersede any current legal requirement under which an organization must maintain certain data, such as HIPAA requirements.

Read also: https://www.wsj.com/articles/u-s-websites-go-dark-in-europe-as-gdpr-data-rules-kick-in-1527242038

How Does This Affect Social Media Companies?

Your mind probably just jumped to Facebook and how this will affect social media networks. As we’ve seen since Mark Zuckerberg’s congressional hearing on Capitol Hill two months ago, many social media companies and online networks have already updated their privacy policies and terms of service in anticipation of today’s deadline.

Facebook’s response is going to be closely scrutinized by European regulators in the wake of the Cambridge Analytica breach, as well as lingering concerns over the company’s data collection. The same goes for Twitter, though no major scandal has put it in the public spotlight.

Accountable EU Representative

If you think social media platforms are exempt from this regulation, your thinking is also outdated. GDPR requires that social media companies have a designated EU representative who can be held accountable for the organization’s GDPR compliance within Europe.

Clear Privacy Notice

After hearing Zuckerberg’s testimony, it’s clear that users need to be presented with a simple and clear privacy notice that they can actually understand—not something that looks like a bulk collection of Harry Potter books bound together.

The Right To Be Forgotten

It will be interesting to see how these companies deal with user requests for deletion of certain personal data. It is no longer safe for a company to assume that its customers or users are content with their personal data being held—seeing as most of them have no idea it’s held until something unfortunate happens.

I asked Arizona internet attorney Anette Beebe what she thought about “the right to be forgotten” and how it affects our freedom of speech.

“In the EU, under the Right to Be Forgotten, people who were once bad actors have been able to sweep their history of wrongdoing under the rug. However, in the U.S., we value freedom of speech and providing people with more information so they can make informed decisions, rather than hiding it. I can understand privacy and respect that, but I don’t respect a law that helps unscrupulous people hide from their misdeeds or have truthful but unflattering information taken down just because someone doesn’t like it.”

Beebe anticipates a wave of demand letters directed to website clients, asking for content to be taken down that, in reality, has no chance of being taken down. “It will be interesting to see how the courts tackle these issues moving forward,” says Beebe.

What Happens If You Fail To Comply With GDPR?

Just ask Facebook and Google, which were hit today with a collective $8.8 billion in lawsuits (Facebook, €3.9 billion; Google, €3.7 billion) by Austrian privacy campaigner Max Schrems, alleging violations of GDPR as it pertains to opt-in/opt-out clauses. Specifically, the complaints allege that the way these companies obtain user consent for privacy policies is an “all-or-nothing” choice, asking users to check a small box in order to access services. What happens if you don’t choose “I accept”? You’re denied service, which privacy experts and the EU consider a clear violation of the GDPR’s provisions.

Failing to adhere to the GDPR carries steep penalties of up to €20 million or 4% of global annual turnover, whichever is higher. Reports estimate that about half of the U.S. companies that should be compliant with GDPR requirements by today won’t be. There’s more to it than all those emails arriving in your inbox about updated privacy terms.
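That penalty formula is simple arithmetic: take the greater of a flat €20 million or 4% of global annual turnover. A quick illustrative sketch (ours, not from any regulator):

```python
# Greater of a flat EUR 20 million or 4% of global annual turnover.
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

print(f"{max_gdpr_fine(100_000_000):,.0f}")    # 20,000,000 (flat floor)
print(f"{max_gdpr_fine(2_000_000_000):,.0f}")  # 80,000,000 (4% is higher)
```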

According to a December 2016 PwC survey, 68 percent of U.S.-based companies expected to spend $1 million to $10 million to meet GDPR requirements.

But some websites in the U.S. have decided to block their services entirely rather than adhere to the new regulations, going completely dark. Dozens of American newspapers are currently blocked in Europe, and web services like Instapaper have suspended operations in the European Union for the foreseeable future.


The GDPR is no joke and nothing to mess around with. Not even one day has passed, and multibillion-euro complaints have already been filed.

Today is a big day for every business and organization in the world. Let’s hope that the companies we are loyal to are loyal to us.