For IT chief, the goal is to deliver business value – TechTarget

Synonym for CIO: three words, 21 letters. What is it?

That’s valued business partner, said Hubert Barkley, the IT chief at Waste Industries, a garbage and recycling collection company in Raleigh, N.C.

His title is not CIO — he’s the vice president of information and technology — but he’s effectively “the CIO, the CSO, the director of IT and whatever else that they want me to do.” And his job is to understand the business side of the house and use technology to deliver business value.

He and his team are doing that through an assortment of tech projects. For example, the company uses analytics on maintenance data to determine which trucks in its fleet will need to be replaced. It also built a dashboard for the trucks that serves as a “coaching tool,” tracking how many bins workers picked up in a given amount of time. So, if a driver who was supposed to pick up 700 in under 10 hours did it in 11, “we say, ‘Well, you did 11. What happened?’ They go, ‘Oh, truck broke down.’”

SearchCIO spoke with Barkley in December about how IT at Waste Industries is working to deliver business value. Edited excerpts are below.

How would you describe your role as an IT executive at Waste Industries?

Hubert Barkley: I’m the lead visionary, if you will. I’m the CIO, the CSO, the director of IT and whatever else that they want me to do, because I’m also a — I’ll use the words valued business partner. I try to understand the business. I can speak the language of the people here, and I think it’s very critical for all IT leaders to understand the business from the business owners’ perspective — and then see how you can provide a solution that will make it successful for them, not to tell them how it’s going to be and you don’t understand what they need.

I had a conversation with my COO this morning. I said, ‘You’re responsible for the operations of this company, and you probably look at Waste Industries and say this is an operational company that picks up garbage.’ He goes, ‘Absolutely.’ I said, ‘Well, I look at it differently. I look at this as an information technology company that happens to pick up garbage. And the distinction is my job is to find leading-edge, cutting-edge technology that will allow us to do our jobs more efficiently, effectively and bring better value to our business and our investors.’ I think all CIOs need to think in this realm, regardless of what they’re doing.

My job is to find leading-edge, cutting-edge technology that will allow us to do our jobs more efficiently, effectively and bring better value to our business and our investors.
Hubert Barkley, vice president of information and technology, Waste Industries

What are some examples of how you’re using technology to deliver business value?

Barkley: We use predictive analytics to determine when we’re going to replace our trucks — in other words, we determine the life of a truck based off of its maintenance history and the environment that it runs in. So, we have multiple different kinds of trucks, and I can tell you, ‘Hey, in year seven you’re going to want to replace truck X. Because after that, even if you put a new engine and transmission in it, you’re going to get diminishing returns.’
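
To make that concrete, here is a minimal sketch of the kind of replacement-year model Barkley describes. The column names and figures are hypothetical, not Waste Industries’ actual data or method:

```python
# Purely illustrative: predict a truck's economical replacement year from
# maintenance history. Column names and figures are hypothetical, not
# Waste Industries' actual schema or model.
import pandas as pd
from sklearn.linear_model import LinearRegression

fleet = pd.DataFrame({
    "annual_repair_cost": [4000, 5200, 7500, 9100, 12800, 15000],
    "route_hilliness":    [0.2,  0.4,  0.3,  0.7,  0.8,   0.9],
    "replaced_in_year":   [9,    8,    8,    7,    6,     6],
})

model = LinearRegression().fit(
    fleet[["annual_repair_cost", "route_hilliness"]],
    fleet["replaced_in_year"],
)

# Score a truck running hilly routes with $8,000 a year in repairs.
print(model.predict([[8000, 0.75]]))  # estimated replacement year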

We also do geospatial analysis. We’ll look at census data to decide where we should focus on the sales of our business based on our route density. So, if you’ve been in an area and you say, ‘Hey, here’s an area we service and, oh, by the way, the census data is telling us there’s a lot of new construction over here. There’s a lot of business over here.’ We’ll just route people to go to those places and focus on those, because for us it’s all about the density: The more we can pick up in the least amount of miles we drive is more profitable for us.

We also look at dynamic route optimization. That’s where we’re kind of like UPS. We want our trucks to run the most efficient route and the least amount of time picking up the most waste, and then we can measure that. For example, we’ve created a driver dashboard, which is a coaching tool, where we can say, ‘Hey, you were supposed to pick up 700 cans in nine and a half hours. You did it in nine and a half hours. That’s fantastic.’ If they did it in 11, we say, ‘Well, you did 11. What happened?’ They go, ‘Oh, truck broke down.’ We can do exception-based coaching, so if everybody’s doing good, you don’t need to talk to them and this thing will point out the ones who may need to be talked to or a little more information found out.

Baltimore city government is developing a five-year tech transformation plan – Baltimore

The Baltimore city government CIO’s office is working on a new roadmap to modernize IT.

On Tuesday, the city released a draft of a new document called the “Inclusive Digital Transformation Strategic Plan.” The city says it’s the first of its kind for Baltimore.

When it comes to city government technology, there have been some bright spots in recent months, such as the community partnerships in the TECHealth program run by the Baltimore City Health Department and modernization moves to put permits online.

But the plan makes clear that a more fundamental overhaul stretching across all of city government is needed. Its development comes after Mayor Catherine Pugh appointed former Intel exec Frank Johnson as the city’s CIO in September; Johnson took the helm after a series of resignations in the city’s top tech job since 2012.

As a result of decentralized management and underfunding, “many of the city’s IT capabilities are outdated and lack the modern-day range of capabilities offered by comparable cities,” the report states. The plan seeks to lay out paths to turn that around.

The report states plainly that it’s not meant to be the final call.


“This document is not meant to detail the exact tasks necessary to implement various tech initiatives, but to simply outline the roadmap necessary to establish a tech ecosystem that reduces redundancy and cost, aligns standards, improves the public’s experience with city government and dismantles the digital divide,” the plan states.

The version released Tuesday isn’t final, either. The city is accepting public comments through March 16, with a final version expected to be released in April.

Even though it isn’t finalized, a few key points are worth noting. One big change would be a rebrand. The plan calls for the office that’s been known as the Mayor’s Office of Information Technology to be renamed the Baltimore City Office of Information and Technology. That change would come with a new effort to centralize IT operations where appropriate.

Another focus is around updating the systems used by the city. A big section involves modernizing technology and practices, from new cloud services to introducing DevOps to the city. Developing new civic tech, including open data infrastructure and IoT, is also a focus.

Throughout, the plan also calls for the city to establish partnerships with the community. A big priority is around the city’s tech workforce, including a proposal for a city-run effort to develop a “pipeline of Baltimore-based IT talent.” Through this program, the city wants to create more tech training by partnering with existing organizations and companies, as well as public schools and colleges.

The plan also proposes creating a physical tech center, with a corresponding digital platform, where people from inside and outside city government could work on new solutions.

There’s also a budget ask. Over five years, it calls for essentially doubling the current IT budget of $56 million.

How technology became IndiGo’s passport to profitability

If the gauge of an IT leader is the business value he or she delivers to the organization, Stephen Tame hasn’t done badly. During his stint at IndiGo, he exploited a generation of competitive technologies, including big data, analytics, IoT and mobility, to keep IndiGo soaring high well into the future.

Tame landed at IndiGo in the summer of 2014. As Chief Advisor, IT & Chief Digital Officer, he had his work cut out for him: implement IT to catalyze business advantage and chart the company’s strategic direction. But that was far from all.

As the digital custodian of India’s largest domestic low-cost carrier, he had to embark on a multidimensional effort to reengineer its core business applications so that digital could permeate them. The net result: business value created through digital initiatives.

And Tame was undeniably the right man for the job. He had logged miles of experience in the airline industry, including a decade-long stint with Jetstar Airways as its CIO and head of group information technology. This breadth of experience helps him weigh business objectives and apply innovative solutions to realize them.

What is the 21st Century Cures Act? – Definition from TechTarget

The 21st Century Cures Act is a wide-ranging healthcare bill that funds medical research and development, medical device innovation, mental health research and care, opioid addiction treatment and prevention, and health information technology. Over 10 years, the legislation provides $4.8 billion to the National Institutes of Health, $500 million to the Food and Drug Administration and $1 billion in grants to states to fight opioid addiction.

The bill, known as “Cures,” was approved by large bipartisan majorities in both the House and the Senate and was signed into law by President Barack Obama on Dec. 13, 2016. The Cures bill also significantly loosens FDA regulation of the development of pharmaceutical drugs and advanced medical devices and eliminates FDA regulation of low-risk health apps.

However, the act makes funding of most of the provisions contingent on Congress’ reallocating money for them each year. This mechanism helped secure support from fiscally conservative lawmakers who were worried that the spending would further swell the federal budget deficit.

Several high-profile lawmakers among the 31 who didn’t vote for the bill, including Democratic senators Bernie Sanders of Vermont and Elizabeth Warren of Massachusetts, criticized it as a giveaway to the pharmaceutical and medical device industries.

Mental and behavioral health advocates say the bill is the first major advance in a decade in funding research and treatment for people with mental illness, including promising early intervention in the initial stages of psychosis.

Some critics, though, argued that the act went too far in relaxing some privacy provisions in the name of better treatment.

Goals of the legislation

Congress intended Cures – with its broad reach across medical research and development, drugs and devices, mental health and health IT – to be a definitive step toward modernizing the U.S. healthcare system and recognizing the central roles of technology and science.

The bill also recognizes the importance of, and provides funding for, cutting-edge research projects such as the Precision Medicine Initiative, the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), the Regenerative Medicine Innovation Project and the Cancer Moonshot.

The 21st Century Cures Act’s FDA language includes measures to streamline the design of clinical trials and expedite the approval of medical devices that have demonstrated potential to treat unmet medical needs and life-threatening conditions.

Impact on health IT

One of the 21st Century Cures Act interoperability provisions was among the law’s first to be put into action, with the January 2018 announcement by the Office of the National Coordinator for Health Information Technology (ONC) of the Trusted Exchange Framework.

The law includes specific language directing ONC to set up the framework – a nationwide system for sharing health data among networks run by healthcare systems, health information exchanges, insurers and other healthcare organizations.

Some of the law’s other health IT interoperability measures clarify ONC’s authority to certify health IT software to ensure that electronic protected health information is transferred securely and patients have unfettered access to their own health data.

Cures also gives the government authority to bar vendors and healthcare organizations from the practice known as information blocking, or impeding the flow of health data among healthcare providers, networks, vendors and patients.

Impact on mental health

The 21st Century Cures mental health impact is considerable, according to the American Psychiatric Association.

Among other provisions, the legislation:

  • Establishes a new position, the assistant secretary for mental health and substance abuse, intended to coordinate fragmented mental health resources across the federal government
  • Creates another new position, chief medical officer at the Substance Abuse and Mental Health Services Administration (SAMHSA)
  • Requires SAMHSA to develop a strategic plan every four years to better recruit, train and retain mental health and substance abuse disorder workers
  • Reauthorizes grants to support integrated care for mental and behavioral health, train mental health workers in evidence-based care, and fund college, university and professional programs to expand internships and field placement programs
  • Strengthens enforcement of the Mental Health Parity and Addiction Equity Act of 2008, which mandates that insurers treat mental and physical health issues equally


The bill was introduced on Jan. 6, 2015, by Rep. Suzanne Bonamici, D-Ore.

The House approved the bill on Jan. 7, 2015, and the Senate approved it on Oct. 6, 2015, with an amendment and sent it back to the House, which agreed to the amendment on Nov. 30, 2016.

On Dec. 7, 2016, the Senate agreed to a new House amendment, and on Dec. 8, 2016, the bill was presented to the president, who signed it into law five days later.

Congress should close the loophole allowing warrantless digital car searches – TechCrunch

Most Americans expect the Fourth Amendment — which protects individuals from illegal searches — to extend to their digital lives.

In general, this expectation matches reality: unless law enforcement comes knocking with a warrant, the government cannot search a person’s phone or computer. However, cars are treated differently, and as “connected cars” become increasingly linked to people’s digital identities, there is a risk that police will use this exception to conduct digital searches without warrants.

Congress should close this loophole.

The Fourth Amendment is the cornerstone of people’s right to privacy and freedom from government intrusion in the United States. It requires the government to get a warrant based on probable cause before conducting a search and seizure of personal property.

The Supreme Court has found these protections important enough to update them for the digital world. For example, the court has extended warrant protections to cell phones and vehicle GPS tracking, and it is currently reviewing whether law enforcement officials should be required to get a warrant to obtain cellphone location information from wireless carriers.

However, there has been a long-standing exception for vehicles in the Fourth Amendment: law enforcement officials can stop and search a vehicle based on probable cause without having to get a warrant from a judge.

For example, police officers can stop a vehicle for a routine traffic violation, and search it on the spot if the officers have probable cause that they will find contraband or the evidence of a crime. This lower standard for government searches makes sense in a physical world, where vehicles can only hold so much information and drivers can easily drive away to dispose of evidence.

But cars are changing, both in terms of the amount and the sensitivity of the information they can hold. Next-generation vehicles generate gigabytes of data while driving, enabling a host of new applications that enhance convenience, safety and efficiency for drivers.

When this information can be accessed either through a display interface in the car or programmatically through an on-board computer, law enforcement could gain access to a significant amount of data about drivers without a warrant. For example, police could access in-car apps that contain sensitive information, such as navigation apps that contain travel history, social media apps that store messages and other personal information, and payment apps that contain information about past purchases.

While some of these applications require passwords, many only do so when the driver first logs in. Therefore, they would likely be unlocked when police pull over a driver.

In addition, many drivers may be intimidated into revealing their passwords during a stop, as has happened to travelers forced to unlock their phones at border crossings.

Finally, police could retrieve information stored in an on-board computer which may collect and store a variety of potentially sensitive information about drivers, including their driving behavior. Already, some police use special devices designed to circumvent built-in security measures on citizens’ phones and quickly copy their contents — similar devices could be designed for cars.

Despite these potential risks, a car’s ability to collect information is not inherently privacy-invasive. And importantly, the automotive industry has taken pains to protect consumer privacy. For example, automakers made a series of public commitments in 2014 to establish strict privacy standards for data collected from vehicles, promising not to share consumer information with other businesses without affirmative consent — a standard that is higher than those found in other industries.

However, the auto industry cannot change the laws on digital searches. Policymakers should close this loophole to protect both citizens’ rights and support for technological progress. Congress has previously acted to close loopholes created by technological change.

For example, the Electronic Communications Privacy Act (ECPA), which limits how law enforcement can access digital information, sets different legal standards for obtaining email stored on a PC and email stored in the cloud. As cloud computing adoption has grown, Congress has worked to pass a legislative fix.

Just as Congress has been working to close the loophole for cloud computing, it should close the loophole created by the convergence of digital technology with vehicles. Congress should require law enforcement officials to obtain a warrant before they can access data from a vehicle.

Congress can do this while maintaining the vehicle exception for physical searches and maintaining law enforcement’s access to data held by third parties, such as automakers or wireless providers, through warrants or other lawful processes.

By upholding citizen privacy, Congress can ensure a smooth road ahead for vehicles of the future.

The New Rules of Talent Management – Harvard Business Review

Agile isn’t just for tech anymore. It’s been working its way into other areas and functions, from product development to manufacturing to marketing—and now it’s transforming how organizations hire, develop, and manage their people.

You could say HR is going “agile lite,” applying the general principles without adopting all the tools and protocols from the tech world. It’s a move away from a rules- and planning-based approach toward a simpler and faster model driven by feedback from participants. This new paradigm has really taken off in the area of performance management. (In a 2017 Deloitte survey, 79% of global executives rated agile performance management as a high organizational priority.) But other HR processes are starting to change too.

In many companies that’s happening gradually, almost organically, as a spillover from IT, where more than 90% of organizations already use agile practices. At the Bank of Montreal (BMO), for example, the shift began as tech employees joined cross-functional product-development teams to make the bank more customer focused. The business side has learned agile principles from IT colleagues, and IT has learned about customer needs from the business. One result is that BMO now thinks about performance management in terms of teams, not just individuals. Elsewhere the move to agile HR has been faster and more deliberate. GE is a prime example. Seen for many years as a paragon of management through control systems, it switched to FastWorks, a lean approach that cuts back on top-down financial controls and empowers teams to manage projects as needs evolve.

The changes in HR have been a long time coming. After World War II, when manufacturing dominated the industrial landscape, planning was at the heart of human resources: Companies recruited lifers, gave them rotational assignments to support their development, groomed them years in advance to take on bigger and bigger roles, and tied their raises directly to each incremental move up the ladder. The bureaucracy was the point: Organizations wanted their talent practices to be rules-based and internally consistent so that they could reliably meet five-year (and sometimes 15-year) plans. That made sense. Every other aspect of companies, from core businesses to administrative functions, took the long view in their goal setting, budgeting, and operations. HR reflected and supported what they were doing.

By the 1990s, as business became less predictable and companies needed to acquire new skills fast, that traditional approach began to bend—but it didn’t quite break. Lateral hiring from the outside—to get more flexibility—replaced a good deal of the internal development and promotions. “Broadband” compensation gave managers greater latitude to reward people for growth and achievement within roles. For the most part, though, the old model persisted. Like other functions, HR was still built around the long term. Workforce and succession planning carried on, even though changes in the economy and in the business often rendered those plans irrelevant. Annual appraisals continued, despite almost universal dissatisfaction with them.

Now we’re seeing a more sweeping transformation. Why is this the moment for it? Because rapid innovation has become a strategic imperative for most companies, not just a subset. To get it, businesses have looked to Silicon Valley and to software companies in particular, emulating their agile practices for managing projects. So top-down planning models are giving way to nimbler, user-driven methods that are better suited for adapting in the near term, such as rapid prototyping, iterative feedback, team-based decisions, and task-centered “sprints.” As BMO’s chief transformation officer, Lynn Roger, puts it, “Speed is the new business currency.”

With the business justification for the old HR systems gone and the agile playbook available to copy, people management is finally getting its long-awaited overhaul too. In this article we’ll illustrate some of the profound changes companies are making in their talent practices and describe the challenges they face in their transition to agile HR.

Where We’re Seeing the Biggest Changes

Because HR touches every aspect—and every employee—of an organization, its agile transformation may be even more extensive (and more difficult) than the changes in other functions. Companies are redesigning their talent practices in the following areas:

Performance appraisals.

When businesses adopted agile methods in their core operations, they dropped the charade of trying to plan a year or more in advance how projects would go and when they would end. So in many cases the first traditional HR practice to go was the annual performance review, along with employee goals that “cascaded” down from business and unit objectives each year. As individuals worked on shorter-term projects of various lengths, often run by different leaders and organized around teams, the notion that performance feedback would come once a year, from one boss, made little sense. They needed more of it, more often, from more people.

An early CEB survey suggested that people actually got less feedback and support when their employers dropped annual reviews. That’s because many companies put nothing in their place: Managers felt no pressing need to adopt a new feedback model and shifted their attention to other priorities. But dropping appraisals without a plan to fill the void was, of course, a recipe for failure.

Since learning that hard lesson, many organizations have switched to frequent performance assessments, often conducted project by project. This change has spread to a number of industries, including retail (Gap), big pharma (Pfizer), insurance (Cigna), investing (OppenheimerFunds), consumer products (P&G), and accounting (all Big Four firms). It is most famous at GE, across the firm’s range of businesses, and at IBM. Overall, the focus is on delivering more-immediate feedback throughout the year so that teams can become nimbler, “course-correct” mistakes, improve performance, and learn through iteration—all key agile principles.

In user-centered fashion, managers and employees have had a hand in shaping, testing, and refining new processes. For instance, Johnson & Johnson offered its businesses the chance to participate in an experiment: They could try out a new continual-feedback process, using a customized app with which employees, peers, and bosses could exchange comments in real time.

The new process was an attempt to move away from J&J’s event-driven “five conversations” framework (which focused on goal setting, career discussion, a midyear performance review, a year-end appraisal, and a compensation review) and toward a model of ongoing dialogue. Those who tried it were asked to share how well everything worked, what the bugs were, and so on. The experiment lasted three months. At first only 20% of the managers in the pilot actively participated. The inertia from prior years of annual appraisals was hard to overcome. But then the company used training to show managers what good feedback could look like and designated “change champions” to model the desired behaviors on their teams. By the end of the three months, 46% of managers in the pilot group had joined in, exchanging 3,000 pieces of feedback.

Regeneron Pharmaceuticals, a fast-growing biotech company, is going even further with its appraisals overhaul. Michelle Weitzman-Garcia, Regeneron’s head of workforce development, argued that the performance of the scientists working on drug development, the product supply group, the field sales force, and the corporate functions should not be measured on the same cycle or in the same way. She observed that these employee groups needed varying feedback and that they even operated on different calendars.

So the company created four distinct appraisal processes, tailored to the various groups’ needs. The research scientists and postdocs, for example, crave metrics and are keen on assessing competencies, so they meet with managers twice a year for competency evaluations and milestones reviews. Customer-facing groups include feedback from clients and customers in their assessments. Although having to manage four separate processes adds complexity, they all reinforce the new norm of continual feedback. And Weitzman-Garcia says the benefits to the organization far outweigh the costs to HR.


Coaching.

The companies that most effectively adopt agile talent practices invest in sharpening managers’ coaching skills. Supervisors at Cigna go through “coach” training designed for busy managers: It’s broken into weekly 90-minute videos that can be viewed as people have time. The supervisors also engage in learning sessions, which, like “learning sprints” in agile project management, are brief and spread out to allow individuals to reflect and test-drive new skills on the job. Peer-to-peer feedback is incorporated in Cigna’s manager training too: Colleagues form learning cohorts to share ideas and tactics. They’re having the kinds of conversations companies want supervisors to have with their direct reports, but they feel freer to share mistakes with one another, without the fear of “evaluation” hanging over their heads.

DigitalOcean, a New York–based start-up focused on software as a service (SaaS) infrastructure, engages a full-time professional coach on-site to help all managers give better feedback to employees and, more broadly, to develop internal coaching capabilities. The idea is that once one experiences good coaching, one becomes a better coach. Not everyone is expected to become a great coach—those in the company who prefer coding to coaching can advance along a technical career track—but coaching skills are considered central to a managerial career.

P&G, too, is intent on making managers better coaches. That’s part of a larger effort to rebuild training and development for supervisors and enhance their role in the organization. By simplifying the performance review process, separating evaluation from development discussions, and eliminating talent calibration sessions (the arbitrary horse trading between supervisors that often comes with a subjective and politicized ranking model), P&G has freed up a lot of time to devote to employees’ growth. But getting supervisors to move from judging employees to coaching them in their day-to-day work has been a challenge in P&G’s tradition-rich culture. So the company has invested heavily in training supervisors on topics such as how to establish employees’ priorities and goals, how to provide feedback about contributions, and how to align employees’ career aspirations with business needs and learning and development plans. The bet is that building employees’ capabilities and relationships with supervisors will increase engagement and therefore help the company innovate and move faster. Even though the jury is still out on the companywide culture shift, P&G is already reporting improvements in these areas, at all levels of management.


Teams.

Traditional HR focused on individuals—their goals, their performance, their needs. But now that so many companies are organizing their work project by project, their management and talent systems are becoming more team focused. Groups are creating, executing, and revising their goals and tasks with scrums—at the team level, in the moment, to adapt quickly to new information as it comes in. (“Scrum” may be the best-known term in the agile lexicon. It comes from rugby, where players pack tightly together to restart play.) They are also taking it upon themselves to track their own progress, identify obstacles, assess their leadership, and generate insights about how to improve performance.

In that context, organizations must learn to contend with:

Multidirectional feedback. Peer feedback is essential to course corrections and employee development in an agile environment, because team members know better than anyone else what each person is contributing. It’s rarely a formal process, and comments are generally directed to the employee, not the supervisor. That keeps input constructive and prevents the undermining of colleagues that sometimes occurs in hypercompetitive workplaces.

But some executives believe that peer feedback should have an impact on performance evaluations. Diane Gherson, IBM’s head of HR, explains that “the relationships between managers and employees change in the context of a network [the collection of projects across which employees work].” Because an agile environment makes it practically impossible to “monitor” performance in the old sense, managers at IBM solicit input from others to help them identify and address issues early on. Unless it’s sensitive, that input is shared in the team’s daily stand-up meetings and captured in an app. Employees may choose whether to include managers and others in their comments to peers. The risk of cutthroat behavior is mitigated by the fact that peer comments to the supervisor also go to the team. Anyone trying to undercut colleagues will be exposed.

In agile organizations, “upward” feedback from employees to team leaders and supervisors is highly valued too. The Mitre Corporation’s not-for-profit research centers have taken steps to encourage it, but they’re finding that this requires concentrated effort. They started with periodic confidential employee surveys and focus groups to discover which issues people wanted to discuss with their managers. HR then distilled that data for supervisors to inform their conversations with direct reports. However, employees were initially hesitant to provide upward feedback—even though it was anonymous and was used for development purposes only—because they weren’t accustomed to voicing their thoughts about what management was doing.

Mitre also learned that the most critical factor in getting subordinates to be candid was having managers explicitly say that they wanted and appreciated comments. Otherwise people might worry, reasonably, that their leaders weren’t really open to feedback and ready to apply it. As with any employee survey, soliciting upward feedback and not acting on it has a diminishing effect on participation; it erodes the hard-earned trust between employees and their managers. When Mitre’s new performance-management and feedback process began, the CEO acknowledged that the research centers would need to iterate and make improvements. A revised system for upward feedback will roll out this year.

Because feedback flows in all directions on teams, many companies use technology to manage the sheer volume of it. Apps allow supervisors, coworkers, and clients to give one another immediate feedback from wherever they are. Crucially, supervisors can download all the comments later on, when it’s time to do evaluations. In some apps, employees and supervisors can score progress on goals; at least one helps managers analyze conversations on project management platforms like Slack to provide feedback on collaboration. Cisco uses proprietary technology to collect weekly raw data, or “breadcrumbs,” from employees about their peers’ performance. Such tools enable managers to see fluctuations in individual performance over time, even within teams. The apps don’t provide an official record of performance, of course, and employees may want to discuss problems face-to-face to avoid having them recorded in a file that can be downloaded. We know that companies recognize and reward improvement as well as actual performance, however, so hiding problems may not always pay off for employees.

Frontline decision rights. The fundamental shift toward teams has also affected decision rights: Organizations are pushing them down to the front lines, equipping and empowering employees to operate more independently. But that’s a huge behavioral change, and people need support to pull it off. Let’s return to the Bank of Montreal example to illustrate how it can work. When BMO introduced agile teams to design some new customer services, senior leaders weren’t quite ready to give up control, and the people under them were not used to taking it. So the bank embedded agile coaches in business teams. They began by putting everyone, including high-level executives, through “retrospectives”—regular reflection and feedback sessions held after each iteration. These are the agile version of after-action reviews; their purpose is to keep improving processes. Because the retrospectives quickly identified concrete successes, failures, and root causes, senior leaders at BMO immediately recognized their value, which helped them get on board with agile generally and loosen their grip on decision making.

Complex team dynamics. Finally, since the supervisor’s role has moved away from just managing individuals and toward the much more complicated task of promoting productive, healthy team dynamics, people often need help with that, too. Cisco’s special Team Intelligence unit provides that kind of support. It’s charged with identifying the company’s best-performing teams, analyzing how they operate, and helping other teams learn how to become more like them. It uses an enterprise-wide platform called Team Space, which tracks data on team projects, needs, and achievements to both measure and improve what teams are doing within units and across the company.


Compensation.

Pay is changing as well. A simple adaptation to agile work, seen in retail companies such as Macy’s, is to use spot bonuses to recognize contributions when they happen rather than rely solely on end-of-year salary increases. Research and practice have shown that compensation works best as a motivator when it comes as soon as possible after the desired behavior. Instant rewards reinforce instant feedback in a powerful way. Annual merit-based raises are less effective, because too much time goes by.

Patagonia has actually eliminated annual raises for its knowledge workers. Instead the company adjusts wages for each job much more frequently, according to research on where market rates are going. Increases can also be allocated when employees take on more-difficult projects or go above and beyond in other ways. The company retains a budget for the top 1% of individual contributors, and supervisors can make a case for any contribution that merits that designation, including contributions to teams.

Compensation is also being used to reinforce agile values such as learning and knowledge sharing. In the start-up world, for instance, the online clothing-rental company Rent the Runway dropped separate bonuses, rolling the money into base pay. CEO Jennifer Hyman reports that the bonus program was getting in the way of honest peer feedback. Employees weren’t sharing constructive criticism, knowing it could have negative financial consequences for their colleagues. The new system prevents that problem by “untangling the two,” Hyman says.

DigitalOcean redesigned its rewards to promote equitable treatment of employees and a culture of collaboration. Salary adjustments now happen twice a year to respond to changes in the outside labor market and in jobs and performance. More important, DigitalOcean has closed gaps in pay for equivalent work. It’s deliberately heading off internal rivalry, painfully aware of the problems in hypercompetitive cultures (think Microsoft and Amazon). To personalize compensation, the firm maps where people are having impact in their roles and where they need to grow and develop. The data on individuals’ impact on the business is a key factor in discussions about pay. Negotiating to raise your own salary is fiercely discouraged. And only the top 1% of achievement is rewarded financially; otherwise, there is no merit-pay process. All employees are eligible for bonuses, which are based on company performance rather than individual contributions. To further support collaboration, DigitalOcean is diversifying its portfolio of rewards to include nonfinancial, meaningful gifts, such as a Kindle loaded with the CEO’s “best books” picks.

How does DigitalOcean motivate people to perform their best without inflated financial rewards? Matt Hoffman, its vice president of people, says it focuses on creating a culture that inspires purpose and creativity. So far that seems to be working. The latest engagement survey, via Culture Amp, ranks DigitalOcean 17 points above the industry benchmark in satisfaction with compensation.


Recruiting.

With the improvements in the economy since the Great Recession, recruiting and hiring have become more urgent—and more agile. To scale up quickly in 2015, GE’s new digital division pioneered some interesting recruiting experiments. For instance, a cross-functional team works together on all hiring requisitions. A “head count manager” represents the interests of internal stakeholders who want their positions filled quickly and appropriately. Hiring managers rotate on and off the team, depending on whether they’re currently hiring, and a scrum master oversees the process.

To keep things moving, the team focuses on vacancies that have cleared all the hurdles—no req’s get started if debate is still ongoing about the desired attributes of candidates. Openings are ranked, and the team concentrates on the top-priority hires until they are completed. It works on several hires at once so that members can share information about candidates who may fit better in other roles. The team keeps track of its cycle time for filling positions and monitors all open requisitions on a kanban board to identify bottlenecks and blocked processes. IBM now takes a similar approach to recruitment.

Companies are also relying more heavily on technology to find and track candidates who are well suited to an agile work environment. GE, IBM, and Cisco are working with the vendor Ascendify to create software that does just this. The IT recruiting company HackerRank offers an online tool for the same purpose.

Learning and development.

Like hiring, L&D had to change to bring new skills into organizations more quickly. Most companies already have a suite of online learning modules that employees can access on demand. Although helpful for those who have clearly defined needs, this is a bit like giving a student the key to a library and telling her to figure out what she must know and then learn it. Newer approaches use data analysis to identify the skills required for particular jobs and for advancement and then suggest to individual employees what kinds of training and future jobs make sense for them, given their experience and interests.

IBM uses artificial intelligence to generate such advice, starting with employees’ profiles, which include prior and current roles, expected career trajectory, and training programs completed. The company has also created special training for agile environments—using, for example, animated simulations built around a series of “personas” to illustrate useful behaviors, such as offering constructive criticism.

Traditionally, L&D has included succession planning—the epitome of top-down, long-range thinking, whereby individuals are picked years in advance to take on the most crucial leadership roles, usually in the hope that they will develop certain capabilities on schedule. The world often fails to cooperate with those plans, though. Companies routinely find that by the time senior leadership positions open up, their needs have changed. The most common solution is to ignore the plan and start a search from scratch. But organizations often continue doing long-term succession planning anyway. (About half of large companies have a plan to develop successors for the top job.) Pepsi is one company taking a simple step away from this model by shortening the time frame. It provides brief quarterly updates on the development of possible successors—in contrast to the usual annual updates—and delays appointments so that they happen closer to when successors are likely to step into their roles.

Ongoing Challenges

To be sure, not every organization or group is in hot pursuit of rapid innovation. Some jobs must remain largely rules based. (Consider the work that accountants, nuclear control-room operators, and surgeons do.) In such cases agile talent practices may not make sense.

And even when they’re appropriate, they may meet resistance—especially within HR. A lot of processes have to change for an organization to move away from a planning-based, “waterfall” model (which is linear rather than flexible and adaptive), and some of them are hardwired into information systems, job titles, and so forth. The move toward cloud-based IT, which is happening independently, has made it easier to adopt app-based tools. But people issues remain a sticking point. Many HR tasks, such as traditional approaches to recruitment, onboarding, and program coordination, will become obsolete, as will expertise in those areas.

Meanwhile, new tasks are being created. Helping supervisors replace judging with coaching is a big challenge not just in terms of skills but also because it undercuts their status and formal authority. Shifting the focus of management from individuals to teams may be even more difficult, because team dynamics can be a black box to those who are still struggling to understand how to coach individuals. The big question is whether companies can help managers take all this on and see the value in it.

The HR function will also require reskilling. It will need more expertise in IT support—especially given all the performance data generated by the new apps—and deeper knowledge about teams and hands-on supervision. HR has not had to change in recent decades nearly as much as have the line operations it supports. But now the pressure is on, and it’s coming from the operating level, which makes it much harder to cling to old talent practices.

Serious quantum computers are finally here. What are we going to do with them? – MIT Technology Review

Inside a small laboratory in lush countryside about 50 miles north of New York City, an elaborate tangle of tubes and electronics dangles from the ceiling. This mess of equipment is a computer. Not just any computer, but one on the verge of passing what may, perhaps, go down as one of the most important milestones in the history of the field.

Quantum computers promise to run calculations far beyond the reach of any conventional supercomputer. They might revolutionize the discovery of new materials by making it possible to simulate the behavior of matter down to the atomic level. Or they could upend cryptography and security by cracking otherwise invincible codes. There is even hope they will supercharge artificial intelligence by crunching through data more efficiently.


Yet only now, after decades of gradual progress, are researchers finally close to building quantum computers powerful enough to do things that conventional computers cannot. It’s a landmark somewhat theatrically dubbed “quantum supremacy.” Google has been leading the charge toward this milestone, while Intel and Microsoft also have significant quantum efforts. And then there are well-funded startups including Rigetti Computing, IonQ, and Quantum Circuits.

No other contender can match IBM’s pedigree in this area, though. Starting 50 years ago, the company produced advances in materials science that laid the foundations for the computer revolution. Which is why, last October, I found myself at IBM’s Thomas J. Watson Research Center to try to answer these questions: What, if anything, will a quantum computer be good for? And can a practical, reliable one even be built?

Why we think we need a quantum computer

The research center, located in Yorktown Heights, looks a bit like a flying saucer as imagined in 1961. It was designed by the neo-futurist architect Eero Saarinen and built during IBM’s heyday as a maker of large mainframe business machines. IBM was the world’s largest computer company, and within a decade of the research center’s construction it had become the world’s fifth-largest company of any kind, just behind Ford and General Electric.

While the hallways of the building look out onto the countryside, the design is such that none of the offices inside have any windows. It was in one of these cloistered rooms that I met Charles Bennett. Now in his 70s, he has large white sideburns, wears black socks with sandals, and even sports a pocket protector with pens in it. Surrounded by old computer monitors, chemistry models, and, curiously, a small disco ball, he recalled the birth of quantum computing as if it were yesterday.

Charles Bennett of IBM Research is one of the founding fathers of quantum information theory. His work at IBM helped create a theoretical foundation for quantum computing.

When Bennett joined IBM in 1972, quantum physics was already half a century old, but computing still relied on classical physics and the mathematical theory of information that Claude Shannon had developed at Bell Labs in the 1940s. It was Shannon who defined the quantity of information in terms of the number of “bits” (a term he popularized but did not coin) required to store it. Those bits, the 0s and 1s of binary code, are the basis of all conventional computing.

A year after arriving at Yorktown Heights, Bennett helped lay the foundation for a quantum information theory that would challenge all that. It relies on exploiting the peculiar behavior of objects at the atomic scale. At that size, a particle can exist “superposed” in many states (e.g., many different positions) at once. Two particles can also exhibit “entanglement,” so that changing the state of one may instantaneously affect the other.

Bennett and others realized that some kinds of computations that are exponentially time consuming, or even impossible, could be efficiently performed with the help of quantum phenomena. A quantum computer would store information in quantum bits, or qubits. Qubits can exist in superpositions of 1 and 0, and entanglement and a trick called interference can be used to find the solution to a computation over an exponentially large number of states. It’s annoyingly hard to compare quantum and classical computers, but roughly speaking, a quantum computer with just a few hundred qubits would be able to perform more calculations simultaneously than there are atoms in the known universe.

In the summer of 1981, IBM and MIT organized a landmark event called the First Conference on the Physics of Computation. It took place at Endicott House, a French-style mansion not far from the MIT campus.

In a photo that Bennett took during the conference, several of the most influential figures from the history of computing and quantum physics can be seen on the lawn, including Konrad Zuse, who developed the first programmable computer, and Richard Feynman, an important contributor to quantum theory. Feynman gave the conference’s keynote speech, in which he raised the idea of computing using quantum effects. “The biggest boost quantum information theory got was from Feynman,” Bennett told me. “He said, ‘Nature is quantum, goddamn it! So if we want to simulate it, we need a quantum computer.’”

IBM’s quantum computer—one of the most promising in existence—is located just down the hall from Bennett’s office. The machine is designed to create and manipulate the essential element in a quantum computer: the qubits that store information.

This lab at IBM houses quantum machines connected to the cloud.

The gap between the dream and the reality

The IBM machine exploits quantum phenomena that occur in superconducting materials. For instance, sometimes current will flow clockwise and counterclockwise at the same time. IBM’s computer uses superconducting circuits in which two distinct electromagnetic energy states make up a qubit.

The superconducting approach has key advantages. The hardware can be made using well-established manufacturing methods, and a conventional computer can be used to control the system. The qubits in a superconducting circuit are also easier to manipulate and less delicate than individual photons or ions.

Inside IBM’s quantum lab, engineers are working on a version of the computer with 50 qubits. You can run a simulation of a simple quantum computer on a normal computer, but at around 50 qubits it becomes nearly impossible. That means IBM is theoretically approaching the point where a quantum computer can solve problems a classical computer cannot: in other words, quantum supremacy.
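
Back-of-the-envelope arithmetic shows why roughly 50 qubits is the cutoff. A full simulation has to store one complex amplitude per basis state, and n qubits have 2^n basis states; the short Python sketch below (assuming the conventional 16 bytes per complex amplitude) tallies the memory that requires:

```python
# Memory needed to store a full n-qubit state vector on a classical machine:
# 2**n complex amplitudes at 16 bytes each (two 64-bit floats).
for n in (30, 40, 50):
    gib = (2 ** n) * 16 / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits fit in a workstation's RAM (16 GiB); 50 qubits would need
# roughly 16 million GiB (16 PiB), beyond any existing machine.
```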

But as IBM’s researchers will tell you, quantum supremacy is an elusive concept. You would need all 50 qubits to work perfectly, when in reality quantum computers are beset by errors that need to be corrected for. It is also devilishly difficult to maintain qubits for any length of time; they tend to “decohere,” or lose their delicate quantum nature, much as a smoke ring breaks up at the slightest air current. And the more qubits, the harder both challenges become.

“If you had 50 or 100 qubits and they really worked well enough, and were fully error-corrected—you could do unfathomable calculations that can’t be replicated on any classical machine, now or ever,” says Robert Schoelkopf, a Yale professor and founder of a company called Quantum Circuits. “The flip side to quantum computing is that there are exponential ways for it to go wrong.”

The chips inside IBM’s quantum computer (at bottom) are cooled to 15 millikelvin.

Another reason for caution is that it isn’t obvious how useful even a perfectly functioning quantum computer would be. It doesn’t simply speed up any task you throw at it; in fact, for many calculations, it would actually be slower than classical machines. Only a handful of algorithms have so far been devised where a quantum computer would clearly have an edge. And even for those, that edge might be short-lived. The most famous quantum algorithm, developed by Peter Shor at MIT, is for finding the prime factors of an integer. Many common cryptographic schemes rely on the fact that this is hard for a conventional computer to do. But cryptography could adapt, creating new kinds of codes that don’t rely on factorization.
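
The structure of Shor’s algorithm is easier to see in code. Factoring N reduces to finding the multiplicative order r of a base a modulo N; the quantum computer’s only job is to find r exponentially faster than brute force. This toy Python version, illustrative only and hopeless for large N, shows the classical scaffolding around that one quantum step:

```python
from math import gcd

def factor_via_order(N, a):
    """Factor N using the order-finding reduction at the heart of Shor's
    algorithm. Everything here is classical; a quantum computer's sole
    contribution would be finding the order r exponentially faster."""
    g = gcd(a, N)
    if g != 1:
        return g                    # lucky: a already shares a factor with N
    r, x = 1, a % N
    while x != 1:                   # brute-force order finding; this loop is
        r, x = r + 1, (x * a) % N   # the exponential step Shor makes fast
    if r % 2:
        return None                 # odd order: retry with another base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                 # trivial square root: retry
    return gcd(y - 1, N)            # nontrivial factor of N

print(factor_via_order(15, 7))      # order of 7 mod 15 is 4, so this prints 3
```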

This is why, even as they near the 50-qubit milestone, IBM’s own researchers are keen to dispel the hype around it. At a table in the hallway that looks out onto the lush lawn outside, I encountered Jay Gambetta, a tall, easygoing Australian who researches quantum algorithms and potential applications for IBM’s hardware. “We’re at this unique stage,” he said, choosing his words with care. “We have this device that is more complicated than you can simulate on a classical computer, but it’s not yet controllable to the precision that you could do the algorithms you know how to do.”

What gives the IBMers hope is that even an imperfect quantum computer might still be a useful one.

Gambetta and other researchers have zeroed in on an application that Feynman envisioned back in 1981. Chemical reactions and the properties of materials are determined by the interactions between atoms and molecules. Those interactions are governed by quantum phenomena. A quantum computer can—at least in theory—model those in a way a conventional one cannot.

Last year, Gambetta and colleagues at IBM used a seven-qubit machine to simulate the precise structure of beryllium hydride. At just three atoms, it is the most complex molecule ever modeled with a quantum system. Ultimately, researchers might use quantum computers to design more efficient solar cells, more effective drugs, or catalysts that turn sunlight into clean fuels.

Those goals are a long way off. But, Gambetta says, it may be possible to get valuable results from an error-prone quantum machine paired with a classical computer.

From a physicist’s dream to an engineer’s nightmare

“The thing driving the hype is the realization that quantum computing is actually real,” says Isaac Chuang, a lean, soft-spoken MIT professor. “It is no longer a physicist’s dream—it is an engineer’s nightmare.”

Chuang led the development of some of the earliest quantum computers, working at IBM in Almaden, California, during the late 1990s and early 2000s. Though he is no longer working on them, he thinks we are at the beginning of something very big—that quantum computing will eventually even play a role in artificial intelligence.

But he also suspects that the revolution will not really begin until a new generation of students and hackers get to play with practical machines. Quantum computers require not just different programming languages but a fundamentally different way of thinking about what programming is. As Gambetta puts it: “We don’t really know what the equivalent of ‘Hello, world’ is on a quantum computer.”

We are beginning to find out. In 2016 IBM connected a small quantum computer to the cloud. Using a programming tool kit called QISKit, you can run simple programs on it; thousands of people, from academic researchers to schoolkids, have built QISKit programs that run basic quantum algorithms. Now Google and other companies are also putting their nascent quantum computers online. You can’t do much with them, but at least they give people outside the leading labs a taste of what may be coming.
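
For a sense of what those QISKit programs look like, here is a sketch of the de facto quantum “Hello, world”: preparing and measuring an entangled Bell pair. It assumes an older, pre-1.0 QISKit interface (qiskit.execute and the Aer simulator); the exact entry points have shifted across releases.

```python
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into an equal superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # collapse both qubits into classical bits

backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                # ~half '00' and ~half '11'; never '01' or '10'
```

Swapping the simulator for one of IBM’s cloud-connected devices is, in principle, a one-line change of backend.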

The startup community is also getting excited. A short while after seeing IBM’s quantum computer, I went to the University of Toronto’s business school to sit in on a pitch competition for quantum startups. Teams of entrepreneurs nervously got up and presented their ideas to a group of professors and investors. One company hoped to use quantum computers to model the financial markets. Another planned to have them design new proteins. Yet another wanted to build more advanced AI systems. What went unacknowledged in the room was that each team was proposing a business built on a technology so revolutionary that it barely exists. Few seemed daunted by that fact.

This enthusiasm could sour if the first quantum computers are slow to find a practical use. The best guess from those who truly know the difficulties—people like Bennett and Chuang—is that the first useful machines are still several years away. And that’s assuming the problem of managing and manipulating a large collection of qubits won’t ultimately prove intractable.

Still, the experts hold out hope. When I asked him what the world might be like when my two-year-old son grows up, Chuang, who learned to use computers by playing with microchips, responded with a grin. “Maybe your kid will have a kit for building a quantum computer,” he said.


DOST: Advancing science, technology agenda best option for PHL growth – Business Mirror


IN a recent meeting with members of the Makati Business Club and several foreign chambers of commerce, the government’s chief scientist Fortunato dela Peña encouraged local and foreign businessmen to invest in technology-related enterprises. Dela Peña, Secretary of the Department of Science and Technology (DOST), said this is relevant as the government is now investing heavily in science and technology.

The chief scientist cited advances in health and medicine development, with the country’s numerous traditional medicinal herbs as a focus, along with education, energy, disaster resiliency and climate-change adaptation, including enterprises that deal in creativity, such as design.

A civil servant for two decades who held various teaching and civil-service positions before his appointment as the DOST’s top official under the present administration, Dela Peña said that for a time, the country seemed to have grown “resistant” to science- and technology-related endeavors, although a core of advocates persisted in pushing the science agenda.

The progressive minds seemed to have prevailed, the DOST official said.

Recently, the Philippines ranked 73rd out of 128 economies on a science, technology and innovation (STI) index, which cited the country’s strength in the research and commercialization of STI ideas. The report also said that 60 percent of companies in the country offer training to improve the technical skills of their employees.


HOWEVER, a study by the Philippine Institute for Development Studies highlighted the weak ties between innovation-driven firms and the government, and also flagged the country’s low expenditure on research and development (R&D).

According to Dela Peña, this is the reason the government is now exerting every effort to reach out to the private sector, explaining that STI plays an important role in economic and social progress and is a key driver of an economy’s long-term growth.

Technology adoption, the official said, allows a country’s firms and citizens to benefit from innovations created in other countries, letting it catch up and even leapfrog obsolete technologies.

This can lead to significant improvements in productivity for existing firms in agriculture, industry and services.

For one, long-term investments in building local capacity for technology generation can lead to innovations that will give local firms a competitive advantage. This can result in the creation of new firms and even entirely new industries.

For another, the local medicine sector has been showing potential, the DOST official said, citing the two dozen local herbs and medicinal plants currently being studied.


WHEN asked about the case of Lagundi (Vitex negundo), whose efficacy as medicine is being challenged by drug manufacturers, DOST Assistant Secretary for International Cooperation Leah J. Buendia said the shrub was subjected to 20 years of stringent clinical trials and has consistently been proven effective.

“But since the DOST, and the government for that matter, is not into commercialization, private companies are the ones who manufacture the component of the medicinal plant into commercially available medicines,” Buendia said, adding that the agency only provides the results, including the right formula and volume of the medicinal component.

Asked about the possibility that private manufacturers might knowingly dilute the strength required for the medicine to be effective in order to cut costs, making the commercially available product ineffective, the official declined to comment.

She gave assurances, however, that the private sector is working with the agency to commercialize discoveries made and studied by the DOST in partnership with the private sector, as these “products would not help private companies profit but advance the country’s agenda.”

That science agenda, despite advances, is still in need of prioritization and more funding. It is laid out in the Philippine Development Plan 2017-2022, which devotes an entire chapter to STI.

STI culture

RECENT positive developments and advancements in science and technology notwithstanding, there remains a low level of innovation in the country, brought about by weaknesses in STI human capital, low R&D expenditure and weak linkages in the STI ecosystem.

In last year’s Global Innovation Index (GII) Report, the Philippines ranked 74th among 128 economies in overall innovation, garnering a score of 31.8 out of 100. This is a slight improvement over 2015, when the country scored 31.1 and ranked 83rd out of 141 economies.

The country also ranked fifth out of seven members of the Association of Southeast Asian Nations (Asean) that were included in the survey. The Philippines was ahead of Cambodia (95th) and Indonesia (88th) but lagged behind Singapore (6th), Malaysia (35th), Thailand (52nd) and Vietnam (59th).

The factors behind the weak performance of the STI sector include a weak STI culture, Dela Peña said.

There is a lack of public awareness of and interest in STI, and many sectors do not recognize, appreciate or understand the use of technology- and science-based information in their daily activities.

There are also a number of weaknesses in social and professional culture, such as the research culture in universities and the commercialization of results from public research. According to Dela Peña, a lack of awareness of intellectual-property rights persists, both in the research community and among the general public.

Despite their availability, the adoption and application of technologies among micro, small and medium enterprises (MSMEs) and in sectors like agriculture and fisheries remain low, he added.


LOW government and private spending on STI is another factor behind the weak performance of the STI sector, according to the GII report.

Investments in R&D are central for enhancing the country’s innovation ecosystem, the report said. Expenditures on R&D and innovation activities, as well as the support given to the development of human resources in various fields of science and technology (S&T), are the parameters scrutinized in the monitoring and evaluation of STI.

While nominal R&D expenditures increased by 80 percent to P15.92 billion in 2013, the proportion of R&D spending to GDP stood at only 0.14 percent. This is substantially below the 1-percent benchmark recommended by the United Nations Educational, Scientific and Cultural Organization (Unesco) and the global average of 2.04 percent. It is also low compared with other Asean countries, such as Vietnam (0.19 percent), Thailand (0.36 percent), Malaysia (1.09 percent) and Singapore (2.0 percent). The data is available online from Unesco’s Institute for Statistics.
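As a rough cross-check of those numbers (assuming the P15.92 billion and the 0.14-percent share both refer to the same 2013 base), the GDP they imply is about P11.4 trillion, roughly in line with the Philippines’ nominal GDP that year:

```python
# Back out the GDP implied by the article's two R&D figures for 2013.
rnd_spending_b = 15.92        # R&D expenditure, in billions of pesos
share_of_gdp = 0.14 / 100     # 0.14 percent, as a fraction

implied_gdp_b = rnd_spending_b / share_of_gdp
print(f"Implied GDP: about P{implied_gdp_b:,.0f} billion")  # ~P11,371 billion, i.e. ~P11.4 trillion
```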

The country’s relatively low ranking in the GII Report was pulled down by weaknesses in human capital and R&D, with a score of 22.7 out of 100, ranking 95th. This is due to the low public and private expenditures on education and R&D, as well as low tertiary inbound mobility. Tertiary inbound mobility refers to the number of students from abroad studying in a given country, as a percentage of the total tertiary or college enrollment.

The bulk of R&D spending, about 60 percent, comes from the public sector. These funds were directed to agricultural and industrial production and technology; the protection and improvement of human health; and the control and care of the environment, among others. Most R&D activities in the country remain concentrated in the National Capital Region, Calabarzon and Central Luzon.


ANOTHER indicator measuring the capacity for technology generation is the number of S&T human resources engaged in R&D.

As of 2013, the country had a total of 36,517 R&D personnel, of whom 26,495 were researchers and scientific, technological and engineering personnel engaged in R&D; the rest were technicians and support personnel.

The figures denote that there are only 270 researchers for every one million Filipinos. That ratio falls short of the Unesco norm of 380 per million population and the average of 1,020 researchers per million across the developing economies of East Asia and the Pacific.
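The same kind of back-of-the-envelope check works for the researcher ratio (again assuming the 26,495 researchers and the 270-per-million figure share the 2013 base):

```python
# Back out the population base implied by the researcher-density figure.
researchers = 26_495
per_million = 270

implied_population_m = researchers / per_million
print(f"Implied population: about {implied_population_m:.0f} million")
# ~98 million, the Philippines' approximate population in 2013
```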

Of the total researchers in the country from the government, higher educational institutions (HEIs) and private nonprofit sectors, 14 percent had doctoral degrees (PhD), 38 percent had master’s degrees, while 34 percent had Bachelor of Science (BS) up to post-BS degrees. The low number of researchers reflects the propensity of the country’s educational system to produce graduates outside of science, technology, engineering and mathematics, or Stem, programs, the disciplines where R&D flourishes. Nevertheless, the latest GII report indicates that in terms of graduates in science and engineering, the country garnered a score of 25.5 out of 100, ranking 26th.


AN assessment of the country’s innovation system conducted by a program of the United States Agency for International Development (Usaid) reveals that the supply of Stem graduates exceeds local demand.

As a result, there is an out-migration and, worse, underemployment of many skilled, locally trained scientists and engineers. The report by the Usaid’s Science, Technology, Research and Innovation for Development, or Stride, program also cited a shortage in training for fields critical for innovation, particularly in information technology. Such shortage contributes to the challenge that many local companies face, especially in securing employees with the skills required to grow the business.

This partly explains the nature of the country’s brain drain: it is not so much that Filipinos are not “nationalistic” but simply that people of science have limited opportunities to stay in the country.

However, Buendia said the issue of nationalism has some credence, if not the absolute answer, citing the case of South Korea in the 1950s.

When South Korea was at its economic lowest, the government called on its scientists and engineers scattered around the world to come home and help rebuild the economy, and many responded, she said.

According to Buendia, this brain drain compounds the problem, as potential researchers, scientists and engineers, the key actors needed for the innovation ecosystem to flourish, prefer to seek employment overseas due to better economic opportunities and potential for advancement. Since knowledge and technology are mostly embodied in human resources, this underscores the urgency of accelerating the development of R&D human resources.


THE output of R&D is commonly measured by the number of patents applied for and granted to Filipino residents.

However, reports show that many universities do not have the expertise to market their patent portfolios for commercial use. Furthermore, technology generators face persistent issues over technology ownership, while researchers are constrained by the “publish or perish” phenomenon.

The result is a weak technology-transfer system in the country.

An annual average of 209 patent applications and 597 industrial-design applications were filed from 2005 to 2015. In the same period, an annual average of 54 patents, 446 utility models and 502 industrial designs were granted.

In 2016, the World Economic Forum (WEF) ranked the Philippines 86th out of 128 economies for the number of patents filed under the Patent Cooperation Treaty per million population. Invention patents granted to local inventors represent the smallest share in number of intellectual properties granted from 2001 to 2003. Industrial design and utility models consistently comprise the majority of the intellectual property granted.

The country also needs to catch up in research publications: the number of scientific publications in peer-reviewed journals per million population stands at 55, substantially below that of fellow Asean member-states such as Singapore (a staggering 10,368), Malaysia (1,484), Thailand (478) and Vietnam (105).


ANOTHER factor behind the weak performance of the STI sector is the weak linkages among players in the STI ecosystem.

The 2009 Survey of Innovation Activities and the 2014 Usaid-Stride Assessment of the Philippine Innovation Ecosystem found that innovation actors have weak cooperation, partnerships and trust among themselves. Most HEIs perceive collaboration with companies as outside their core missions and as a risk of exploitation.

Consequently, firms report that difficulties in convincing HEIs of their shared interests stem from resentment, suspicion and distrust. In effect, firms end up with little technical assistance from the government and research institutions.

Another factor in this equation is restrictive regulations that hamper implementation of R&D programs and projects.

The tedious government procurement process hobbles the immediate procurement of equipment and other needed materials for research, which, in turn, delays the implementation of R&D projects, the GII report said. This was confirmed by the Usaid-Stride study, which revealed that restrictive regulations make the procurement of equipment and consumables for research extremely slow and unnecessarily complex, decreasing research productivity, publication potential, and speed-to-market of innovation.

In addition, the report said government research grants do not compensate universities for the salaries of faculty members engaged in research activities, a practice rarely seen outside the Philippines.

The final factor in the weak performance of the STI sector is the inadequacy of STI infrastructure, which includes laboratory facilities, testing facilities and R&D centers.

Many existing hubs need upgrading to improve their services, which contributes to the lack of absorptive capacity in research institutions, the Usaid-Stride report said. It also noted that public institutions fail to provide young researchers with equipment packages, particularly those returning from PhD studies abroad with more advanced research agendas.

The country’s leading research institutions also remain concentrated in Luzon.


DESPITE the many inadequacies, from funding to human capital, there are some technology-intensive research and capacity-building projects that have resulted in products now in successful use.

One is the micro-satellite.

In April 2016, the country launched into space its first micro-satellite called Diwata-1. It was designed, developed and assembled by Filipino researchers and engineers under the guidance of Japanese experts. The Diwata (deity in English) satellite provides real-time, high-resolution and multi-color infrared images for various applications, including meteorological imaging, crop and ocean productivity measurement and high-resolution imaging of natural and man-made features.

It enables a more precise estimate of the country’s agricultural production, provides images of watersheds and floodplains for a better understanding of water available for irrigation, power and domestic consumption. The satellite also provides accurate information on any disturbance and degradation of forest and upland areas.

The country also has the Nationwide Operational Assessment of Hazards (Noah), which uses lidar (light detection and ranging) technology. Project Noah was initiated in June 2012 to help manage risks associated with natural hazards and disasters. The project developed hydromet sensors and high-resolution geohazard maps, generated by lidar, for flood modeling.

Noah helps the government provide timely warnings, with a lead time of at least six hours, ahead of impending floods.

According to Buendia, the country is now training Cambodians on this technology as part of partnerships among Asean countries, much as Japan assisted the country’s scientists and engineers in building its first micro-satellite.

Another hope lies in the so-called Intelligent Operation Center Platform.

Established through a collaboration between the local government of Davao City and IBM Philippines Inc., the center provides a dashboard that allows authorized government agencies, such as the police, fire department and antiterrorism task force, to use analytics software to monitor events and operations in real time.


THE DOST, in cooperation with HEIs and research institutions, established advanced facilities that seek to spur R&D activities and provide MSMEs access to testing services needed to increase their productivity and competitive advantage.

One is the Advanced Device and Materials Testing Laboratories. The center houses advanced equipment for failure analysis and materials characterization to address advanced analytical needs for quality control, materials identification and R&D. Closely related to this facility is the Electronics Products Development Center, used to design, develop and test hardware and software for electronic products.

There are also high-performance computing facilities that perform tests and run computationally intensive applications for numerical weather prediction, climate modeling, analytics, and data modeling and archiving.

The Philippines could also boast of its Genome Center, a core facility that combines basic and applied research for the development of health diagnostics, therapeutics, DNA forensics and preventive products, and improved crop varieties.

According to Buendia, the country also has drug-discovery facilities, which address the requirements for producing high-quality and globally acceptable drug candidates. She said the Philippines also has nanotechnology centers, which provide technical services and an enabling environment for interdisciplinary and collaborative R&D in various nanotechnology applications.

Buendia said there are also radiation processing facilities that are used to degrade, graft, or crosslink polymers, monomers, or chemical compounds for industrial, agricultural, environmental and medical applications. The Philippines could also boast of its Die and Mold Solutions Center, which enhances the competitiveness of the local tool and die sector through the localization of currently imported dies and molds.

These reflect that the country is advancing, albeit slowly, toward a culture that embraces STI as a sure path to growth, according to Dela Peña.


Alladin S. Diega

Alladin S. Diega has been working with BusinessMirror as a correspondent since 2013. Mr. Diega currently covers the Metro page. He studied BS Journalism at the Lyceum of the Philippines and has also worked with various nongovernment organizations. He breeds African lovebirds as a hobby.

30 Incredibly Useful Things You Didn’t Know Slack Could Do – Fast Company

4. Save your place in a channel or direct message by holding down the Alt or Option key and then clicking any timestamp (or long-pressing on a timestamp from the mobile app and selecting “Mark unread”). The next time you open that thread on any device, you’ll be taken directly to the point you set.

5. Get where you need to go faster by letting Slack’s Quick Switcher beam you directly to any channel, direct message, or team—no clicking or navigation required. Just hit Ctrl-K or Cmd-K and type the first letter of your desired destination. You’ll see a list of options appear and can select the one you want or type additional letters to narrow it down further. (Bonus tip: If you’re using one of the Slack desktop apps, Ctrl-T or Cmd-T will do the same thing. Take your pick!)

6. Make it easy to find important messages or files by pinning them to a channel or private message. Click the three-dot menu icon above a message (or long-press it from the mobile app) and select “Pin to this conversation.” The item will then appear under a pushpin icon at the top of the thread, where everyone can see and access it as needed.

Put a pin in something important so everyone can find it with ease.

7. Catch up on activity quickly by putting Slack’s All Unreads feature to use. First, make sure the feature is activated within the Sidebar section of the desktop app’s preferences. Then just look for the “All Unreads” line in the sidebar, and start there to see a single centralized list of everything you haven’t read.

8. When a channel gets too cluttered for you to focus, let Slack hide all the inline image previews—including, yes, any animated GIFs your Giphy-loving colleagues have summoned. Type “/collapse” to send ’em all a-packin’, then type “/expand” if and when you want everything back.

Get Your Message Across

9. You may know about Slack’s basic text-formatting commands (asterisks around text for bold, underscores for italics), but the app also has advanced options that’ll let you further transform the appearance of your words. To wit:
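Here is a representative sample of those marks, typed directly around your text in the message box; tildes, backticks, and the leading angle bracket shown below are standard Slack formatting:

```
~your text~     strikethrough
`your text`     inline code
> your text     blockquote
```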

The Case Against Google – The New York Times

Standard Oil’s technological discoveries gave the company huge advantages over its rivals, and Rockefeller exploited those advantages ruthlessly. He cut secret deals with railroads so that other firms had to pay more for transportation. He forced smaller refineries to choose between selling out to him or facing bankruptcy. “Rockefeller and his associates did not build the Standard Oil Co. in the boardrooms of Wall Street,” wrote Ida Tarbell, a muckraking journalist of the day. “They fought their way to control by rebate and drawback, bribe and blackmail, espionage and price cutting, and perhaps more important, by ruthless, never slothful efficiency of organization.”

In 1906, President Theodore Roosevelt ordered his Justice Department to sue Standard Oil for antitrust violations. But government lawyers faced a quandary: It wasn’t illegal for Standard Oil to be a monopoly. It wasn’t even illegal to compete mercilessly. So government prosecutors found a new argument: If a firm is more powerful than everyone else, they said, it can’t simply act like everyone else. Instead, it has to live by a special set of rules, so that other companies get a fair shot. “The theory was that competition is good, and if a monopoly extinguishes competition, that’s bad,” says Herbert Hovenkamp, co-author of a seminal treatise on antitrust law. “Once you become a monopoly, you have to start acting differently, and if you don’t, then what you’ve been doing all along starts breaking the law.”

The Supreme Court agreed and split Standard Oil into 34 firms. (Rockefeller received stock in all of them and became even wealthier.) In the decades following the Standard Oil breakup, antitrust enforcement generally abided by a core principle: When a company grows so powerful that it becomes a gatekeeper, and uses that might to undermine competitors, then the government should intervene. And in the last century, as courts have censured other monopolies, academics and jurists have noticed a pattern: Monopolies and technology often seem intertwined. When a company discovers a technological advantage — like the innovations of Rockefeller’s scientists — it sometimes makes that firm so powerful that it becomes a monopoly almost without trying very hard. Many of the most important antitrust lawsuits in American history — against IBM, Alcoa, Kodak and others — were rooted in claims that one company had made technological discoveries that allowed it to outpace competitors.

For decades, there seemed to be a consensus among policymakers and business leaders (though not always among targeted companies) about how the antitrust laws should be enforced. But around the turn of this century, a number of tech companies emerged that caused some people to question whether the antitrust formula made sense anymore. Firms like Google and Facebook have become increasingly useful as they have grown bigger and bigger — a characteristic known as network effects. What’s more, some have argued that the online world is so fast-moving that no antitrust lawsuit can keep pace. Nowadays even the biggest titan can be defeated by a tiny start-up, as long as the newcomer has better ideas or faster tech. Antitrust laws, digital executives said, aren’t needed anymore.

Consider Microsoft. The government spent most of the 1990s suing Microsoft for antitrust violations, a prosecution that many now view as a complete waste of time and money. When Microsoft’s chief executive, Bill Gates, signed a consent decree to resolve one of its monopoly investigations in 1994, he told a reporter that it was essentially pointless for the company’s various divisions: “None of the people who run those divisions are going to change what they do or think.” Even after a federal judge ordered Microsoft broken into separate companies in 2000, the punishment didn’t take. Microsoft fought the ruling and won on appeal. The government then offered a settlement so feeble that nine states begged the court to reject the proposal. It was approved.

What eventually humbled Bill Gates and ended Microsoft’s monopoly wasn’t antitrust prosecutions, observers say, but a more nimble start-up named Google, a search engine designed by two Stanford Ph.D. dropouts that outperformed Microsoft’s own forays into search (first MSN Search and now Bing). Then those two dropouts introduced a series of applications, like Google Docs and Google Sheets, that eventually began to compete with almost every aspect of Microsoft’s businesses. And Google did all that not by relying on government prosecutors but by being smarter. You don’t need antitrust in the digital marketplace, critics argue. “When our products don’t work or we make mistakes, it’s easy for users to go elsewhere because our competition is only a click away,” Google’s co-founder, Larry Page, said in 2012. Translation: The government ought to stop worrying, because no online giant will ever survive any longer than it deserves to.

Once Foundem was available to everyone, the company’s honeymoon lasted precisely two days. During its first 48 hours, the Raffs saw a rush of traffic from users typing product queries into Google and other search engines. But then, suddenly, the traffic stopped. Alarmed, Adam and Shivaun began running diagnostics. They quickly discovered that their site, which until then had been appearing near the top of search results, was now languishing on Google, mired 12 or 15 or 64 or 170 pages down. On other search engines, like MSN Search and Yahoo, Foundem still ranked high. But on Google, Foundem had effectively disappeared. And Google, of course, was where a vast majority of people searched online.