India’s IT Sector – Growing Opportunities for Investment – India Briefing

India holds prominent status for its capabilities in the outsourcing of information technology (IT) services; it accounts for about 67 percent of the global outsourcing market.

Drawing investor appeal is India’s ecosystem of skilled professionals with English language fluency, high-quality infrastructure (connectivity, business centers, and educational institutions), and supportive government policies.

Moreover, building upon its early outsourcing legacy, India is now evolving into a value-added destination – delivering cost-efficient technology for global clients and opening new industry verticals in system integration, consulting services, and technology-enabled business services built on social media, mobility, analytics, and cloud computing, popularly known as SMAC.

While leading Indian multinational firms such as Tata Consultancy Services, Infotech, Wipro, Cognizant, and Infosys dominate India’s IT sector, the country also houses a large number of small and medium-sized enterprise (SME) players that contribute over 30 percent of the country’s IT exports.

While SME service providers have their own focused, niche IT segments, the larger multinational companies (MNCs) offer a wide range of services in multiple verticals, including infrastructure management and business support services. Indian MNCs have a presence in over 60 countries, and account for about 70 percent of India’s total IT export revenue.


According to the industry body – the National Association of Software and Services Companies (NASSCOM) – IT exports from India will exceed US$330 billion by 2019-20. This accounts for nearly 14 percent of projected worldwide spending, if India maintains its current share of the global offshore IT market.

Below we highlight what makes India an attractive IT investment destination, and why now is the right time for foreign SMEs to enter the Indian market.

Advantage India – Talent acquisition, wide labor pool

India has a strong mix of young and technically trained professionals. Government institutes such as the Indian Institute of Technology (IIT), the Indian Institute of Management (IIM), and the Indian School of Business (ISB) are known to produce high quality STEM (science, technology, engineering, and math) talent with diverse business skills.

These professionals not only offer wage arbitrage to companies, but also enable the industry to address growing domestic and global opportunities in digital transformation, while maintaining high productivity rates. In 2016, over six million graduates were estimated to have been added to India’s total labor force, over half of whom have skill sets well suited to the IT sector.

Global quality standards

Indian IT vendors follow international best practices and have adopted several global quality certification standards, including:

  • The Capability Maturity Model (CMM), which assesses the ability of a company’s processes to implement a contracted software project;
  • The International Organization for Standardization’s ISO 9000 series, which applies to quality assurance in the design, development, production, installation, and servicing of software;
  • Total Quality Management (TQM);
  • Six Sigma quality certification; and,
  • Customer Operations Performance Center (COPC) certification, which confirms the ability to deliver quality software development on time.

In fact, over 75 percent of the world’s CMM level 5 certified companies are Indian. India also has one of the highest numbers of ISO 9000-certified software companies in the world.

Low service, labor costs

The quality and cost of services are two key variables that make India the most preferred sourcing destination in the world.

India can help companies generate cost savings of between 40 and 60 percent compared to leading software markets like the US, depending on the type of services offered. For instance, hiring a software developer in the US costs between US$50 and US$150 an hour on average (depending on the tool being developed), while developers in India can be hired for as low as US$15 to US$45 an hour.

This pricing flexibility is particularly important for SMEs in the sector as they are able to operate more efficiently on competitive budgets and gain larger profit margins on final products.

New IT hubs in emerging tier II and tier III cities

As basic infrastructure improves across the country, the geographical spread of IT companies in India is expanding to cover lower tier cities, such as Ahmedabad and Surat (Gujarat state), Bhubaneswar (Odisha state), Chandigarh and Ludhiana (Punjab state), Coimbatore (Tamil Nadu state), Jaipur (Rajasthan state), Kochi and Thiruvananthapuram (Kerala state), Mangalore (Karnataka state), and Nagpur (Maharashtra state).


Lower labor costs, affordable real estate, and the establishment of Software Technology Parks of India (STPI) centers and Special Economic Zones (SEZs) in these cities are attractive for IT companies.

Lower tier cities have given rise to a domestic hub and spoke model wherein tier I cities are hubs with a network of tier II and III spokes. The model allows organizations to diversify their operations to tier II and III cities at low costs while maintaining their high profile business clientele in tier I metro cities.

Federal support, digital infrastructure

The Indian government has adopted a framework of policies that offer advantages to the IT outsourcing industry. These include national policies, financial incentives, and the development of infrastructure, which complement the local state-level initiatives targeting the IT sector.


At present, the government’s Digital India campaign is injecting US$20 billion in investments to improve internet infrastructure in the country, and facilitate online payments systems and e-governance, among other things.

The Startup India campaign, meanwhile, supports technology startups that are seen as critical to the growth and development of the technology industry’s SMAC capabilities. SMAC stands for social media, mobility, data analytics, and cloud computing, and India’s over 4,000 startups are at the frontier of this innovation in the rapidly transforming IT landscape.

IT firms in India can take advantage of fiscal incentives offered by the government to Export Oriented Units (EOUs), STPI units, and SEZs. Further, the government has reduced the tax rate on royalties and fees for technical services from 25 percent to 10 percent.

Aside from establishing software technology parks, the federal State Wide Area Networks (SWAN) scheme has established networks across all 29 states and six union territories, with a minimum bandwidth capacity of 2 Mbps, at a total cost of US$512.69 million.

Time to invest in India

For foreign SMEs looking for a cost-effective IT destination, now is the right time to invest in India. The sector is in the middle of an exciting flux as new technologies are becoming mainstream, creating further demand to match increased service competencies.

Once the Indian IT sector consolidates its business, training, and infrastructure along SMAC lines, its firms will be able to compete with global MNCs as full IT service providers, rather than just maintenance and back-end support providers. In 2017, NASSCOM reported that the sector grew by 10.5 percent, with a growing focus on developing high-end services.

Consequently, new niche segments are accelerating the overall expansion of India’s IT sector capabilities, creating new digital solutions and technology verticals that serve industries as diverse as healthcare, finance, transportation, and manufacturing. Across the Indian economy, organizations are now spending up to 45 percent of their IT budgets on optimizing traditional IT infrastructure, developing high-skilled talent, and innovating digital processes to make their business adaptive and resilient.

This R&D-driven disruption, however, has entailed necessary, if interim, layoffs in the sector as it upgrades to the next development stage.

Foreign companies engaged in providing back-end (outsourcing) services can effectively tap into excess labor in the Indian market, at a competitive cost.

Meanwhile, the country’s IT ecosystem will continue to develop, including the associated education and training sectors, providing foreign entrants the upper edge they need in the global market.

About Us

India Briefing is published by Asia Briefing, a subsidiary of Dezan Shira & Associates. We produce material for foreign investors throughout Eurasia, including ASEAN, China, Indonesia, Russia, the Silk Road, & Vietnam. For editorial matters please contact us here and for a complimentary subscription to our products, please click here.

Dezan Shira & Associates provides business intelligence, due diligence, legal, tax and advisory services throughout India and the Asian region. We maintain offices in Delhi and Mumbai, and throughout China, South-East Asia, and Russia. For assistance with investment into India or Asia overall, please contact us or visit our website.

Why So Many High-Profile Digital Transformations Fail – Daily


In 2011, GE embarked upon an ambitious attempt to digitally transform its product and service offerings. The company created impressive digital capabilities, labeling itself a “digital industrial” company, embedding sensors into many products, building a huge new software platform for the Internet of Things, and transforming business models for its industrial offerings. GE also went to work on transforming internal processes like sales and supplier relationships. Some performance indicators, including service margins, began to improve. The company received much acclaim for its transformation in the press (including some from us).

However, investors didn’t seem to acknowledge its transformation. The company’s stock price has languished for years, and CEO Jeff Immelt — a powerful advocate of the company’s digital ambitions — recently departed the company under pressure from activist investors. Other senior executives have left as well. The new CEO, John Flannery, is focused primarily on cutting costs.

GE is hardly the only company to run into performance issues and sooner-than-expected executive departures in the midst of a huge digital transformation effort. Lego recently defunded its Digital Designer virtual building program. Nike halved the size of its digital unit in 2014 by discontinuing its Nike+ Fuelband activity tracker and some other investments. Procter & Gamble wanted to become “the most digital company on the planet” in 2012, but ran into growth challenges in a difficult economy. Burberry set out to be the world’s best digital luxury brand, but performance began to suffer after initially improving. Ford invested heavily in digital initiatives only to see its stock price lag due to cost and quality issues elsewhere in the company. These companies spent millions to develop digital products, infrastructures, and brand accompaniments, and got tremendous media and investor attention, only to encounter significant performance challenges, and often shareholder dissent. At P&G, then-CEO Bob McDonald was asked to leave by his board, as was Ford CEO Mark Fields. At Lego and Burberry, the CEOs leading the digital charge stepped into lesser roles.

What can we learn from these examples of digital dreams deferred? How did these smart, experienced leaders make decisions that don’t look so smart in hindsight? They made the investments, they got a lot of exciting feedback from their digital leaders and from the press, they increased the investments, and the cycle repeated. However, while their companies had plenty of resources, the big digital bets did not pay off quickly enough, or richly enough, to counter the drain they represented on the rest of the business.

We think there’s something more here than executive over-exuberance or slowing markets. This kind of unfortunate decision has happened over and over again, in wave after wave of transformative business technology. It happened with e-commerce, when companies like Staples and Walmart invested heavily in separate e-commerce units, only to have those units drain the company of resources. It happened with analytics and big data, when companies like Sears and Zynga invested millions in creating analytics units that never paid back their investments. And now it’s happening with digital transformation.

Several key lessons emerge when heavy commitments to digital capability development meet basic financial performance problems. A clear one is that there are many factors, such as the economy or the desirability of your products, that can affect a company’s success as much or more than its digital capabilities. Therefore, no managers should view digital — or any other major technological innovation — as their sure salvation.

Second, digital is not just a thing that you can buy and plug into the organization. It is multi-faceted and diffuse, and doesn’t just involve technology. Digital transformation is an ongoing process of changing the way you do business. It requires foundational investments in skills, projects, infrastructure, and, often, in cleaning up IT systems. It requires mixing people, machines, and business processes, with all of the messiness that entails. It also requires continuous monitoring and intervention, from the top, to ensure that both digital leaders and non-digital leaders are making good decisions about their transformation efforts.

Third, it’s important to calibrate your digital investments to the readiness of your industry — both customers and competitors. For example, when P&G was making its digital push in 2012 and 2013, it was already well ahead of most companies — and perhaps all of them — in the consumer products industry. Our assessment is that it probably still is, even though the digital push was slowed when McDonald departed. P&G would probably have lost little ground to competitors had it invested in digital in a more targeted fashion. Today it does so; no digital initiative is undertaken at P&G if it doesn’t fit the strategy closely and if it’s not hardwired to value. This digital governance discipline is an excellent idea given that the company is now under attack by a different corporate raider.

Finally, when things are not going so well in the existing business, the call of a new business model can become more powerful than it should. Many a person has allowed the excitement of a new relationship to destroy their stable, if less exciting, married life. Similarly, the prospect of launching a sexy technology-based business is tantalizing. The allure of digital can become all-consuming, causing executives to pay too much attention to the new and not enough to the old. Sears’ investments in analytics were not a bad idea, but the company’s facilities and service needed investment more. Although Nike’s executive team was derided for shrinking the digital unit in 2014, the move allowed them to focus their continuing digital investments on higher-value activities. The company’s recent decision to cut staffing and product variety in the existing business, while continuing to improve digital sales channels, appears to be an effort to optimize across both.

There’s something different about technological change that causes senior executives in large, established firms to act differently than they might otherwise. When investing in a typical strategic change, managers are usually pretty clear about what they want to accomplish and what it will take to get there. There’s a lot of work to get things right, but they know where they’re going and how to measure progress. If the indicators move in the wrong direction, they can take action to set them on the right path, or can make the choice to de-escalate the investment.

With innovative information technology, however, executives sometimes lose their rational decision approaches. Certainly it’s true that in times of radical technological change there’s a lot of figuring out to do. Executives have to understand what new technologies can do, and understand their impact on markets, products/services, and distribution channels. These decisions are inevitably influenced by hype from vendors and the media, expensive consultants offering “thought leadership” insights, many high-profile experiments, and a few exciting success stories that keep people wanting more. A charismatic CIO or Chief Digital Officer may make it even harder to be level-headed in those heady times.

Amid the excitement and uncertainty of a new technological era, it can be very difficult to distinguish between investments you need to make ahead of the market and investments that must be in sync with market readiness. As a CEO it can be tempting to think about the early phases of radical technological change as a chance to dominate a new market rather than learning about the market. Investing ahead of the curve makes sense when we know what the curve is. But with digital transformation there’s a lot of exploration and understanding to accomplish before the curve starts to take shape.

When digital investments don’t quickly pay off, CEOs can feel that the issue they’ve encountered is about not spending enough, rather than the company (or the market) not knowing what the end state actually looks like. They can fear that reducing a highly public commitment to the new business could be seen as failure rather than smart decision-making. They may double down on their chosen strategy rather than pivoting toward the profitable approach, hoping to bully the market rather than learn about it.

In time, markets learn more about what they want, producers learn how to deliver it, and the way forward is clearer than it was before. At this point, it is much easier to make clear-headed decisions about digital. But funding a “big digital” strategy during the figuring out process can take more patience than investors have.

Of course, not all companies with short-term digital indigestion are making bad decisions. In e-commerce, what started 20 years ago as a radical innovation — and then radical destroyer of market value — is now standard practice in every industry. The leading companies, even those that made large unprofitable investments early in the transition, were able to pivot toward more profitable e-commerce strategies. In digital, as with future waves like IoT, AI, and conversational commerce, executives would be wise to be wary about the siren call of new technological innovation. Instead of ramping up quickly, only to ramp down painfully, it would be much better if companies could make steady progress toward the right end state without making such costly mistakes.

15 of the most important women in tech who changed the world – Mashable

The first programmers weren’t men, and the first computers weren’t machines. What they were, in both cases, were women.

Women’s many contributions to technology are frequently left out of the history books. But lately, that’s been changing — at least a little.

Ada Lovelace, considered the first computer programmer and a visionary for what programming and computers could eventually become, has a technology award named after her, and a holiday devoted to celebrating her legacy. Katherine Johnson, meanwhile, the NASA “computer” responsible for successfully plotting the flight paths of some of America’s earliest space exploration expeditions, was the subject of the Hollywood blockbuster Hidden Figures (and the book it’s based on).

Katherine Johnson, whose calculations enabled some of NASA’s first space exploration, was portrayed by Taraji P. Henson in ‘Hidden Figures.’


But the stories of far too many of the women who drove innovation in the 19th, 20th, and into the 21st centuries — these key technological architects of modern life — have long gone unheard, their praises unsung. What about the woman who created the Palm Pilot, the woman who made working from home a reality, the woman who invented online dating, or the woman who helped Obama save the internet? (Yes, they were all women.)

In honor of International Women’s Day, here are 15 great ladies of technology you really need to know about.

1. The women who cracked the secrets of the universe with computation: Williamina Fleming and the Harvard “Computers”

In the late 1800s, men at the Harvard College Observatory were busy gazing at the sky through telescopes, gathering data about the stars and the planets. But what to do with all this raw information?

The head of the Observatory, Edward Pickering, needed someone to crunch the astronomical numbers in order to calculate relationships and effectively measure the universe. Men reportedly turned up their noses at this “clerical” work. So Pickering asked his housemaid, Williamina Fleming, to work as a “computer” at Harvard.

Fleming agreed, going on to lead a team of more than 80 women who did the computational work that’s responsible for how we understand the universe today.

2. The first computer programmers: The Women of ENIAC

The idea that computation and programming was tedious women’s work extended into the 20th century (right up until men found out how cool it was).

In the first half of the 20th century, Harvard’s “computers” grew into a unit of female mathematicians at what would become NASA and its Jet Propulsion Laboratory, working during World War II on behalf of the U.S. Military. The calculations they did plotting ballistic trajectories were time consuming and exceedingly complicated. Two men decided to build a machine that could carry out these calculations. It was called the ENIAC, and it’s now considered the first electronic computer.

But it was the women mathematicians who actually programmed the ENIAC. The ENIAC builders recruited six women who became the world’s first coders, manipulating the ENIAC to calculate missile trajectories.

For many years, people thought the women in ENIAC photos were models. Nope, they were the women actually doing the programming.

The work they did for the army in the 1940s resulted in the first software program, the development of computer memory and storage, and the beginnings of programming language.

3. The ‘mother of computing’: Grace Hopper

“The mother of computing” also got her start in the military. In the late 1940s, Grace Hopper worked at the Harvard Computation Lab as part of the Navy Reserve, programming the Mark 1 computer that brought speed and accuracy to military initiatives.

Later, she transferred to the Eckert-Mauchly Computer Corp, where she worked as a senior mathematician. She helped develop the UNIVAC I computer, the first business-oriented machine. Her accolades include creating the first compiler: software that translates arithmetic into language and unifies programming instruction. She was one of the architects of a “new compiled computer language” called COBOL, which is still a standard of data processing today. Most notably, she’s credited with the idea that computer code could be written and read like language.

4. The woman you have to thank for hybrid car batteries: Annie Easley

Annie Easley made the jump from “human computer” to computer programmer while working at the mid-century agency of what would become NASA. Running simulations at a freaking “Reactor Lab,” she was one of only four African-American employees. She is well known for her work encouraging women and people of color to enter STEM fields.

Computer scientist, mathematician, and space scientist Annie Easley.

Later, her work as a programmer involved energy conversion systems. According to NASA, she “developed and implemented code” that led to the development of the battery used in the first hybrid cars. You’re welcome, Prius drivers.

5. The person who pioneered the gift that is ‘WFH’: Mary Allen Wilkes

Not only did Mary Allen Wilkes help develop what is now considered the first “personal computer” — she was also the first person to have a PC in her home. Wilkes worked on the LINC computer as a programmer and instructions author. She is credited with writing the LINC’s operating program manual, and she was also the programmer of the LAP6 operating system for the LINC. In a 2011 interview, she revealed that she actually took the LINC home with her in order to write the operating system, helping to make working remotely a reality for so many of us today.

6. Her work inspired Steve Jobs’ creation of the first Apple computer: Adele Goldberg

Without this woman, the Apple desktop environment might not look the way it does today.

Goldberg was a researcher at the Xerox Palo Alto Research Center (PARC) in the 1970s. She was the lone woman among a group of men who, together, built the Smalltalk-80 programming language and developed the infrastructure and design for overlapping windows on display screens, or “Graphical User Interface” (GUI).

An exterior view of the Xerox Palo Alto Research Center (PARC) in Palo Alto, California

In the PBS TV show Triumph of the Nerds, Goldberg revealed that she was forced by her superiors to show Smalltalk and the GUI to Steve Jobs and his team, even though she thought it wasn’t a good idea to show Jobs their intellectual property. In the same show, Jobs said he was transfixed by Smalltalk, and that he knew the GUI technology Goldberg had helped develop represented the future of computing, and of Apple.

7. The woman who basically invented online dating: Joan Ball

Unsurprisingly, a group of men at Harvard get credit for the first computerized dating service, called ‘Operation Match.’ But it was actually a woman in England who first devised a way to determine compatibility using a computer.

Joan Ball founded and ran the St. James Computer Dating Service, which she later re-named Com-Pat (short for “computerized compatibility”). She translated survey answers about what a prospective lover did not want in a partner onto punch cards, which she ran through a time-shared computer. Her program would reveal the “match” in the system, and people using the service would receive the name and address of whoever they had been paired with. She made the first match-by-computer in 1964 — a year before Operation Match at Harvard was up and running. So, Tinder and OkCupid users, you really have Joan Ball to thank.
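Ball’s screening idea (filter out anyone who has a trait the respondent explicitly ruled out, then pair whoever survives) translates naturally into a set-based sketch. The names, traits, and data layout below are hypothetical illustrations, not a reconstruction of her punch-card system:

```python
# Each candidate lists their own traits and the traits they ruled
# out in a partner (the survey's "did not want" answers).
candidates = {
    "alice": {"traits": {"smoker"}, "excludes": {"untidy"}},
    "bob":   {"traits": {"tidy"},   "excludes": {"smoker"}},
    "carol": {"traits": {"tidy"},   "excludes": set()},
}

def compatible(a, b):
    """Two people match only if neither has a trait the other ruled out."""
    return not (candidates[a]["traits"] & candidates[b]["excludes"]) and \
           not (candidates[b]["traits"] & candidates[a]["excludes"])

# Bob ruled out smokers and Alice smokes, so they are screened out;
# Carol ruled nothing out, so she and Alice survive the filter.
print(compatible("alice", "bob"))    # False
print(compatible("alice", "carol"))  # True
```

Because the filter only removes mismatches, it reveals a match rather than ranking candidates by similarity, much as Ball’s punch-card runs did.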

8. ‘Google-ing’ something would never have occurred to men without her: Karen Spärck Jones

The search engines we use daily rely on the natural language processing discoveries made by one female computer scientist, Karen Spärck Jones. She was recruited to Cambridge into the “Language Research Unit” by another female professor, the computational linguist Margaret Masterman.

Jones’ most notable achievements laid the groundwork for the sort of information retrieval we use today. She introduced the use of thesauri into language processing, allowing for computational recognition of similar words. And she also introduced the idea and methods of “term weighting” in information retrieval, which helped queries determine which terms were the most relevant.
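Her term-weighting idea, now known as inverse document frequency (IDF), scores a term by how rare it is across a collection: words that appear everywhere carry no signal, while rare words pick out relevant documents. A minimal sketch (the tiny corpus and the exact formula variant are illustrative assumptions, not her original notation):

```python
import math

# A tiny, made-up document collection.
docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "stars and planets fill the sky",
]

def idf(term, corpus):
    """Inverse document frequency: log(N / number of docs containing term)."""
    containing = sum(1 for doc in corpus if term in doc.split())
    return math.log(len(corpus) / containing) if containing else 0.0

print(idf("the", docs))                     # 0.0 -- appears everywhere, carries no weight
print(idf("sky", docs) > idf("cat", docs))  # True -- rarer terms weigh more
```

Search engines combine this collection-level weight with a term’s frequency inside each document (TF-IDF) to rank results for a query.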

9. Before there was GoDaddy, there was this woman: Elizabeth “Jake” Feinler

Before it was called the internet, the ARPAnet was just a series of nodes, overseen by the Department of Defense, that connected several research institutions. The Stanford Research Institute was the “node” that oversaw the entire directory of the fledgling internet, through the “Network Information Center” (NIC). And the NIC was run by a researcher named Elizabeth (Jocelyn) Feinler, who more commonly went by “Jake.” (As a child, Feinler’s little sister’s pronunciation of her name, “Betty Jo,” sounded like “Baby Jake” — hence the nickname.)

If you needed a web address in the early days of the internet, you had to go to this lady.

Basically, Feinler’s outfit was the human Google, the organizational white and yellow pages of every domain on the internet. And if you needed to retrieve an address, or register a new one, you asked Jake. Feinler eventually helped the SRI transition to the Domain Name System (DNS); she helped introduce domain naming protocol, so we have Jake to thank for all the dot coms, dot nets, and dot govs out there today.

10. The person who made retro gaming awesome (before it was retro): Carol Shaw

If you love retro video games, thank Carol Shaw, who could have been behind some of your most cherished graphics.

Shaw is considered the first female video game designer and programmer. She is most famous for her 1982 game River Raid, but she also contributed to 3-D Tic-Tac-Toe (1979) and Video Checkers (1980), among many others. Her unpublished 1978 Polo is the first documented game designed and programmed by a woman. She was embedded in Atari from its earliest days, leaving an indelible mark on the video game industry.

11. Using Apple computers then and now was so intuitive because of her: Susan Kare

Building on the GUI inspired by Adele Goldberg’s team at PARC, the graphic designer Susan Kare is responsible for what remain some of Apple’s signature graphics to this day. First, she took on Steve Jobs’ directive to create a sleeker font for Apple — one that gave each letter its due amount of pixels, and didn’t attempt to make each uniform in the amount of space it took up (like a typewriter).

Kare also developed the idea that the graphics should be easily readable symbols, correlating to real world objects. This resulted in the Apple clock, the pointer finger, the trash can, and more. Even the Apple “command” key was of Kare’s design, inspired by a Swedish symbol for a castle.

Susan Kare’s work, featured in the Albuquerque “Startup” exhibition


12. She paved the way for the smartphone market: Donna Dubinsky

Before there was the iPhone, there was the Blackberry. And before there was the Blackberry … there was the Palm Pilot. Remember those?

The person responsible for introducing “personal digital assistants” (PDA) to the world was a businesswoman named Donna Dubinsky. Though built and prototyped by Jeff Hawkins, the Palm Pilot was brought to market by Dubinsky – an alum of Harvard Business School and Apple who built the first PDA company, Palm. After leaving Palm, Dubinsky founded Handspring, with its signature “Visor” PDA able to store data and access programs beyond a calendar and a few games. Sound familiar?

13. She helped Obama save the internet: Megan Smith

The White House’s third ever chief technology officer was a former Google VP named Megan Smith. Smith served as CTO under President Obama, helping to bring the U.S. government — parts of it reportedly still running on floppy disks in 2015 — into the 21st century.

Among other achievements, Smith closely advised President Obama on his decision to maintain net neutrality, and to endorse a free and open internet. She also created an online resource honoring and telling the stories of women in science and technology. And she strongly advocated for women’s inclusion in STEM fields.

14. The Marvel Cinematic Universe is awesome because of her: Victoria Alonso

The VFX industry is a notorious boys’ club, but one person who’s championed and innovated it from the beginning is the VFX producer Victoria Alonso.

Alonso is now Marvel Studios’ executive vice president of physical production. She has overseen the effects for many of the movies in the Avengers series, Guardians of the Galaxy, and many more. Alonso started her career as a production assistant, working her way up to be one of three top dogs at Marvel. She’s a boss lady if we’ve ever seen one.

15. Tech is more inclusive than ever thanks to her: Angelica Ross

After spending the first two decades of her life harassed by colleagues and shunned by her family for her sexual and gender identity, Angelica Ross, a transgender woman, is now one of the leading advocates for transgender opportunities in tech.

She’s serving fierceness in more ways than one.

Image: Kimberly White/Getty Images for GLAAD

Ross is the founder of TransTech Social Enterprises, which focuses on “lifting people out of poverty” through social work and technical training, and helping gender-nonconforming people get opportunities in technical roles. Not only is Ross a trailblazer herself — she’s paying it forward.

Cheers to that. Happy International Women’s Day!

Cybersecurity, AI top technologies for healthcare firms: Report –

BENGALURU: Cyber security, Big Data analytics and Artificial Intelligence (AI) are among the top technologies employed by healthcare and life sciences firms across the world, a report by global software major Infosys said on Monday.

“Cyber security (77%), Big Data analytics (72%) and AI (59%) are the three digital technologies most utilised by healthcare firms currently,” Infosys said in its report titled “Digital Outlook for Healthcare and Life Sciences Industry”.

The survey included companies from seven countries, including India; however, Infosys did not name the other participating countries or the firms that took part.

The motive of the study was not just to identify the technology trends in healthcare and life sciences industries, but also to understand how technology improved their operations, Infosys said in the report.

Redemption for the DMV – Government Technology

On Jan. 31, 2013, California pulled the plug on a long overdue modernization of its Department of Motor Vehicles (DMV) IT systems. Originally scheduled to be completed that March, the state and its prime contractor had only managed to finish one portion of the $208 million system overhaul.

But California was not the only state to stumble. From coast to coast, DMVs struggled to drag their IT systems into the 21st century. Instead of creating new efficiencies for driver’s licensing and motor vehicle registrations, states were reporting false starts, failures and lawsuits. Meanwhile, as Americans were growing used to one-click, online orders for retail purchases, they found themselves still heading down to the local DMV office to stand in line, where wait times were lengthening while IT upgrades languished.

Worse still, the deadline for Real ID, the law passed by Congress requiring states to follow federal security standards when issuing licenses, was fast approaching. After years of fighting, all states finally agreed to meet the act’s compliance standard by October 2020. That meant every state DMV had to have the capability to verify that a license applicant was in the country legally and to verify with biometrics the authenticity of the person applying for a license or ID card. Try doing that with technology that’s 25 years old — or older.

But five years after California, the nation’s largest state by population, halted work on its DMV modernization, the landscape looks much different. A growing number of DMVs have modernized, pulling the plug on legacy systems, which have been replaced by integrated platforms that have re-engineered business processes while offering customers faster counter service or, even better, the opportunity to conduct transactions online, eliminating the need to visit the DMV office. The new systems have made it simpler to comply with Real ID while also making it easier to add emerging technologies.

So, what happened? Why have so many state DMVs gone from being the laughingstock of IT to showcases of modernization?

When asked how old New Mexico’s legacy DMV technology was, Alicia Ortiz, acting director of the Motor Vehicle Division (MVD), responded that it was built in the late 1970s, a time when mainframes were still new technology, Jimmy Carter was in the White House and the Internet was nothing more than an academic experiment funded by the Department of Defense. Ortiz admitted it was probably the oldest and worst-performing IT system among state DMVs. But New Mexico’s was not the only agency that had clung to old tech. Scores of states have kept their big iron technology operational, despite the growing maintenance costs and shrinking resources, not to mention the lack of skilled workers who could code in COBOL, the computer language that runs mainframes.

Drivers’ Licenses Go Digital

At the same time as DMVs move toward Real ID compliance, five states and the District of Columbia are moving forward on mobile or digital drivers’ licenses in 2018. The hope is that app-based, encrypted licenses and electronic technology to read them will cut costs, create efficiencies and increase safety for both citizens and law enforcement. While there’s no sign that high-tech licenses will replace the traditional form anytime soon, governments are seeing the advantages of a digital option.

Leading the pack is Iowa, where in 2015 the state piloted a 90-day mobile driver’s license project with identity company MorphoTrust USA, now IDEMIA, and is now working to expand the project statewide, making the tech available for both iOS and Android smartphones.

And in a pilot with digital security company Gemalto, with the help of a two-year grant from the National Institute of Standards and Technology, five jurisdictions — Colorado, Maryland, Wyoming, Idaho and Washington, D.C. — are exploring practical ways to implement digital driver’s licenses. Phase 1 of the project in 2017 looked at use cases, such as presenting app-based IDs to purchase lottery tickets or buy alcohol at sports events. Phase 2 in 2018 will expand to cases that include “attribution sharing,” where a user can provide a trusted third party, like a car rental company, with personal information from their digital license.
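The "attribution sharing" idea in Phase 2 boils down to selective disclosure: the holder releases only the attributes a relying party requests and the holder consents to, rather than handing over the whole license. A minimal hypothetical sketch of that flow (the field names and records are invented for illustration, not Gemalto's or any state's actual design):

```python
# Hypothetical sketch of "attribution sharing" from a digital driver's
# license: release only the attributes a relying party (e.g. a car rental
# company) requests AND the holder consents to share.

LICENSE = {
    "name": "Jane Driver",
    "date_of_birth": "1990-04-01",
    "address": "123 Main St",
    "license_number": "D1234567",
    "license_class": "C",
}

def share_attributes(license_record, requested, consented):
    """Release the intersection of what was requested and what the holder
    consented to; everything else stays on the device."""
    return {k: license_record[k]
            for k in requested
            if k in consented and k in license_record}

# A rental company asks for four fields; the holder consents to three,
# withholding the home address.
print(share_attributes(
    LICENSE,
    ["name", "license_number", "license_class", "address"],
    ["name", "license_number", "license_class"],
))  # → {'name': 'Jane Driver', 'license_number': 'D1234567', 'license_class': 'C'}
```

A real deployment would also cryptographically sign the released attributes so the relying party can verify they came from the issuing DMV, which is where the standards work comes in.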

Ironically, the problem was that mainframes were well-built and designed to last, said Frank Dean, head of marketing and customer relations at Fast Enterprises. “Mainframes have always been solid and reliable,” he said. “The problem is that they are limited in what they can do, and now DMVs have reached the capacity for change that mainframes can handle.”

Nothing highlighted that capacity problem better than Real ID, according to Dean. “Real ID is a great example of something that comes along that the mainframe isn’t designed to handle,” he said. “It’s another reason why states are deciding to modernize and look at other options.”

A third factor in favor of modernizing is the impact new technology is having on the market and customers. Call it the “Amazon effect.” Drivers now expect online services, and DMVs are finally responding. “DMVs are moving toward more online transactions, where the customer doesn’t have to come down to the DMV in person,” said Ian Grossman, vice president for public affairs at the American Association of Motor Vehicle Administrators. “It’s become a big push in many states, with an emphasis on cutting down on visits through Web services or the mail.”

For early adopters, however, the transition from mainframe to modern would prove painful. Rhode Island began modernizing back in 2008, but the state suffered a series of setbacks and delays, trading lawsuits with contractor DXC (originally Hewlett-Packard) before launching its new system last July. Walter “Bud” Craddock, administrator for the state’s Division of Motor Vehicles, attributed the troubles to the use of a traditional waterfall implementation method.

“It did not go well,” he reported. “But once we switched to an agile methodology, we made progress.” Today, the state has a Web-based system that is fully integrated and compliant with Real ID.

In 2008, the waterfall methodology, which had reigned over IT implementations for years, was considered the status quo. It has also been the source of many big system failures in government. Unfortunately, what state DMVs want these days is a highly configurable system, not something that takes months to customize and build, according to Frank Dean.

Along with trying to build more flexible systems, early adopters also faced the challenge of re-engineering business processes that were decades old. Further, modernization called for the integration of what were once separate driver’s licensing and motor vehicle systems. Trying to change culture and business, and to integrate disparate systems, while adding brand-new, online services, is not for the faint of heart. The result, said Dean, was a series of modernization projects that turned into battlefields. “There was carnage all over it,” he said.

To avoid the problems that plagued the early adopters, states began to take a more innovative and comprehensive approach to modernization. In Colorado, the Division of Motor Vehicles realized that if it was going to succeed, it had to focus on four key areas: the organization, the processes, the facilities and, finally, IT.

While technology would be critical to its success, the DMV began to gather more comprehensive metrics on how it served existing customers. “That allowed us to set down standards in terms of what we wanted to achieve,” said Michael Dixon, DMV’s senior director. The DMV ran two Lean projects to identify problems, eliminate waste, and demonstrate that it was willing to change both processes and culture. “That helped us in terms of getting the money to make the necessary technological changes,” said Dixon.

Colorado launched its new driver services in early 2017, part of a two-phase project that will be completed later in 2018. The new system will radically change how workers are able to access information and run transactions. One big benefit: No longer will staff have to memorize codes, which will result in less time spent in training and more time in front of customers. The new system is also designed to reduce errors and increase accuracy. For customers, the new technology will allow for more online transactions, which can reduce the number of visits to DMV offices.

What makes Colorado’s new system, known as DRIVES, unique is that it’s the only DMV system in the country run as software as a service. The entire platform for driver and vehicle services is hosted and maintained by the vendor, Fast Enterprises. Dixon said that by allowing its DMV system to operate in Fast Enterprises’ cloud, the state will have technology that is refreshed regularly — instead of every 25 years with the mainframe system — and it won’t have to compete with the private sector to attract highly skilled workers who can maintain the software while keeping it secure.

According to Dean, DMVs are interested in having their software hosted in the cloud, but state laws restrict the location of where identity data can be stored, which limits its use. As a result, the company doesn’t push cloud as a solution. “We would rather give them what they want, what solves their problem, than push a trend.”

New Mexico’s MVD is another example of how a state stumbled with its first attempt to modernize, only to try again and succeed. In 2012, the state shut down a modernization project that had cost $5 million so far. But that setback turned into a valuable lesson, according to Ortiz, leading to a second effort that resulted in a fully implemented, integrated driver’s and vehicle services system by 2016.

MVD spent a considerable amount of time talking with other states, documenting business processes and checking business rules against state statutes to make sure there weren’t any gaps between policies, procedures and technology. More importantly, the agency invested considerable resources in data cleansing. “We had three separate systems and everything was out of sync,” said Ortiz.

The goal was to move away from the siloed approach for data management to an integrated platform that was more customer-centric. After investing $36 million, the state has a system that makes it easier to process transactions with a higher level of accuracy, while drivers can access a growing number of online services. “The new system has given us a lot more flexibility to respond to new technologies as they emerge, such as electronic titles or anything that relates to autonomous vehicles,” said Ortiz.

As DMVs modernize, interesting trends have begun to emerge. Ortiz and other DMV directors have hinted at how mobile technology promises to open up the field of licensing in ways not seen before. In August 2017, Radius Global Market Research and MorphoTrust USA released a report showing a majority of Americans were interested in having mobile driver’s licenses that would be available through a smartphone app. Several states, including Iowa, Maryland and Colorado, among others, have initiatives underway to test the capabilities of a mobile license.

The challenge will be in overcoming the complexity of providing security, so that the license can be validated in real time, according to Frank Dean. “The system on the back end has to be very responsive so that law enforcement can take pieces of information from the mobile license and validate it with the DMV in real time,” he said.

As DMVs modernize and improve their ability to accurately identify and verify a driver, they have emerged as the most reliable government agency when it comes to identification. Federal agencies, including the Social Security Administration, now use a program called DLDV (driver’s license data verification) to properly identify a person who needs to replace their Social Security card. According to Dean, states that have motor vehicle divisions operating within the department of tax or revenue will use an individual’s license number as a form of verification when issuing tax refunds, to reduce fraud.

“There are more and more transactions taking place online without people ever being in the same building together,” said Dean. “To have that level of trust in an economy that is becoming increasingly electronic means you have to have a way to verify that trust. The DMV is the only agency in the U.S. that still regularly sees people and records identity information about them.”

In essence, DMVs have become identity hubs for government agencies at every level. And without modernization of their IT systems, such a system of trust wouldn’t be possible or practical.

Tod Newcombe, Senior Editor

With more than 20 years of experience covering state and local government, Tod previously was the editor of Public CIO, e.Republic’s award-winning publication for information technology executives in the public sector. He is now a senior editor for Government Technology.

For IT chief, the goal is to deliver business value – TechTarget

Synonym for CIO: three words, 21 letters. What is it?

That’s valued business partner, said Hubert Barkley, the IT chief at Waste Industries, a garbage and recycling collection company in Raleigh, N.C.

His title is not CIO — he’s the vice president of information and technology — but he’s effectively “the CIO, the CSO, the director of IT and whatever else that they want me to do.” And his job is to understand the business side of the house and use technology to deliver business value.

He and his team are doing that through an assortment of tech projects. For example, the company uses analytics on maintenance data to determine which trucks in its fleet will need to be replaced. It also built a dashboard for the trucks that serves as a “coaching tool,” tracking how many bins workers picked up in a certain amount of time. So, if a driver who was supposed to pick up 700 in under 10 hours did it in 11, “we say, ‘Well, you did 11. What happened?’ They go, ‘Oh, truck broke down.'”

SearchCIO spoke with Barkley in December about how IT at Waste Industries is working to deliver business value. Edited excerpts are below.

How would you describe your role as an IT executive at Waste Industries?

Hubert Barkley

Hubert Barkley: I’m the lead visionary, if you will. I’m the CIO, the CSO, the director of IT and whatever else that they want me to do, because I’m also a — I’ll use the words valued business partner. I try to understand the business. I can speak the language of the people here, and I think it’s very critical for all IT leaders to understand the business from the business owners’ perspective — and then see how you can provide a solution that will make it successful for them, not to tell them how it’s going to be and you don’t understand what they need.

I had a conversation with my COO this morning. I said, ‘You’re responsible for the operations of this company, and you probably look at Waste Industries and say this is an operational company that picks up garbage.’ He goes, ‘Absolutely.’ I said, ‘Well, I look at it differently. I look at this as an information technology company that happens to pick up garbage. And the distinction is my job is to find leading-edge, cutting-edge technology that will allow us to do our jobs more efficiently, effectively and bring better value to our business and our investors.’ I think all CIOs need to think in this realm, regardless of what they’re doing.

My job is to find leading-edge, cutting-edge technology that will allow us to do our jobs more efficiently, effectively and bring better value to our business and our investors.
Hubert Barkleyvice president of information and technology, Waste Industries

What are some examples of how you’re using technology to deliver business value?

Barkley: We use predictive analytics to determine when we’re going to replace our trucks — in other words, we determine the life of a truck based off of its maintenance history and the environment that it runs in. So, we have multiple different kinds of trucks, and I can tell you, ‘Hey, in year seven you’re going to want to replace truck X. Because after that, even if you put a new engine and transmission in it, you’re going to get diminishing returns.’
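The "diminishing returns" logic Barkley describes can be sketched in a few lines of code. The sketch below is purely hypothetical, not Waste Industries' actual model: it fits a simple least-squares trend to a truck's annual maintenance costs and flags the first year where the projected cost crosses a replacement threshold.

```python
# Hypothetical sketch of "diminishing returns" truck-replacement analysis:
# fit a linear trend to annual maintenance cost, then find the first year
# where the projected cost exceeds a replacement threshold.

def projected_cost(history, year):
    """Extrapolate annual maintenance cost with a least-squares line.
    `history` holds the costs for years 1..len(history)."""
    n = len(history)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    return mean_y + slope * (year - mean_x)

def replacement_year(history, threshold, horizon=15):
    """First future year where projected maintenance crosses the threshold."""
    for year in range(len(history) + 1, horizon + 1):
        if projected_cost(history, year) > threshold:
            return year
    return None  # still economical within the horizon

# Maintenance costs (dollars) for years 1-5 of a truck's life.
costs = [2000, 2500, 3200, 4100, 5300]
print(replacement_year(costs, threshold=8000))  # → 9
```

A production model would of course factor in the operating environment Barkley mentions, not just cost history, but the shape of the decision is the same.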

We also do geospatial analysis. We’ll look at census data to decide where we should focus on the sales of our business based on our route density. So, if you’ve been in an area and you say, ‘Hey, here’s an area we service and, oh, by the way, the census data is telling us there’s a lot of new construction over here. There’s a lot of business over here.’ We’ll just route people to go to those places and focus on those, because for us it’s all about the density: The more we can pick up in the least amount of miles we drive is more profitable for us.

We also look at dynamic route optimization. That’s where we’re kind of like UPS. We want our trucks to run the most efficient route and the least amount of time picking up the most waste, and then we can measure that. For example, we’ve created a driver dashboard, which is a coaching tool, where we can say, ‘Hey, you were supposed to pick up 700 cans in nine and a half hours. You did it in nine and a half hours. That’s fantastic.’ If they did it in 11, we say, ‘Well, you did 11. What happened?’ They go, ‘Oh, truck broke down.’ We can do exception-based coaching, so if everybody’s doing good, you don’t need to talk to them and this thing will point out the ones who may need to be talked to or a little more information found out.
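The exception-based coaching Barkley describes reduces to a simple filter: surface only the routes that ran materially over expectation, and skip everyone who did fine. A minimal hypothetical sketch (the field names and 10 percent tolerance are assumptions, not Waste Industries' actual dashboard logic):

```python
# Hypothetical sketch of exception-based coaching: flag only the routes
# where a driver's actual time materially exceeded the expected time.

def coaching_exceptions(routes, tolerance=0.10):
    """Return (driver, hours over) for routes more than `tolerance` over plan."""
    flagged = []
    for r in routes:
        if r["actual_hours"] > r["expected_hours"] * (1 + tolerance):
            flagged.append((r["driver"], r["actual_hours"] - r["expected_hours"]))
    return flagged

routes = [
    {"driver": "A", "expected_hours": 9.5, "actual_hours": 9.4},   # on plan, skip
    {"driver": "B", "expected_hours": 9.5, "actual_hours": 11.0},  # truck broke down
]
print(coaching_exceptions(routes))  # → [('B', 1.5)]
```

Everything not returned needs no conversation, which is exactly the point: supervisors only talk to the drivers the data singles out.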

Learn about the copy data management software project that’s helping move data across a hybrid cloud-data center architecture in part one of this two-part interview.

Baltimore city government is developing a five-year tech transformation plan – Baltimore

The Baltimore city government CIO’s office is working on a new roadmap to modernize IT.

On Tuesday, the city released a draft of a new document called the “Inclusive Digital Transformation Strategic Plan.” The city says it’s the first of its kind for Baltimore.

When it comes to city government technology, there have been some bright spots in recent months, such as the community partnerships in the TECHealth program run by the Baltimore City Health Department and modernization moves to put permits online.

But the plan makes clear that a more fundamental overhaul stretching across all of city government is needed. Its development comes after Mayor Catherine Pugh appointed former Intel exec Frank Johnson as the city’s CIO in September, who took the helm after a series of resignations in the city’s top tech job since 2012.

As a result of decentralized management and underfunding, “many of the city’s IT capabilities are outdated and lack the modern-day range of capabilities offered by comparable cities,” the report states. The plan seeks to lay out paths to turn that around.

The report states plainly that it’s not meant to be the final call.


“This document is not meant to detail the exact tasks necessary to implement various tech initiatives, but to simply outline the roadmap necessary to establish a tech ecosystem that reduces redundancy and cost, aligns standards, improves the public’s experience with city government and dismantles the digital divide,” the plan states.

The release on Tuesday isn’t a final draft either. The city is accepting public comments through March 16 on this website, with a final version expected to be released in April.

Read the draft plan

Though it’s not finalized, a few key points are worth noting. One big change would be a rebrand. The plan calls for the office that’s been known as the Mayor’s Office of Information Technology to be renamed the Baltimore City Office of Information and Technology. That change would come with a new effort to centralize IT operations where appropriate.

Another focus is around updating the systems used by the city. A big section involves modernizing technology and practices, from new cloud services to introducing DevOps to the city. Developing new civic tech, including open data infrastructure and IoT, is also a focus.

Throughout, the plan also calls for the city to establish partnerships with the community. A big priority is around the city’s tech workforce, including a proposal for a city-run effort to develop a “pipeline of Baltimore-based IT talent.” Through this program, the city wants to create more tech training by partnering with existing organizations and companies, as well as public schools and colleges.

The plan also proposes creating a physical tech center with corresponding digital platform where people from inside and outside city government could work on new solutions.

There’s also a budget ask. Over five years, it calls for essentially doubling the current IT budget of $56 million.




How technology became IndiGo’s passport to profitability –

If the gauge of an IT leader is the business value he or she delivers to the organization, Stephen Tame hasn’t done badly. During his stint at IndiGo he has exploited a generation of competitive technologies like big data, analytics, IoT and mobility to keep IndiGo soaring high well into the future.

Tame landed at IndiGo in the summer of 2014. As the Chief Advisor IT & Chief Digital Officer, he had his work cut out for him: implement IT to catalyze business advantage and chart strategic direction. But that was not nearly enough.

As the digital custodian of the largest domestic low-cost carrier, he had to embark on a multidimensional effort to reengineer its core business applications to help digital permeate through them. Net result: creating business value through digital initiatives.

And Tame was undeniably the right man for the job. He had logged miles of experience in the airline industry, including a decade-long stint with Jetstar Airways as its CIO, Head of Group Information Technology. This breadth of experience helps him weigh business objectives and apply innovative solutions to realize them.

What is 21st Century Cures Act? – Definition from – TechTarget

The 21st Century Cures Act is a wide-ranging healthcare bill that funds medical research and development, medical device innovation, mental health research and care, opioid addiction treatment and prevention, and health information technology. The legislation provides, over 10 years, $4.8 billion to the National Institutes of Health, $500 million to the Food and Drug Administration and $1 billion in grants to states to fight opioid addiction.

The bill, known as “Cures,” was approved by large bipartisan majorities in both the House and the Senate and was signed into law by President Barack Obama on Dec. 13, 2016. The Cures bill also significantly loosens FDA regulation of the development of pharmaceutical drugs and advanced medical devices and eliminates FDA regulation of low-risk health apps.

However, the act makes funding of most of the provisions contingent on Congress’ reallocating money for them each year. This mechanism helped secure support from fiscally conservative lawmakers who were worried that the spending bill would further strain the federal budget deficit.

Several high-profile critics among the 31 lawmakers who didn’t vote for the bill, including Democratic senators Bernie Sanders of Vermont and Elizabeth Warren of Massachusetts, criticized the bill as a giveaway to the pharmaceutical and medical device industries.

Mental and behavioral health advocates say the bill is the first major advance in a decade in funding research and treatment for people with mental illness, including promising early intervention in psychosis.

Some critics, though, argued that the act went too far in relaxing some privacy provisions in the name of better treatment.

Goals of the legislation

Congress intended Cures – with its broad reach across medical research and development, drugs and devices, mental health and health IT – to be a definitive step toward modernizing the U.S. healthcare system and recognizing the central roles of technology and science.

The bill also recognizes the importance of and provides funding for cutting edge research projects such as the Precision Medicine Initiative, BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), Regenerative Medicine Innovation Project and Cancer Moonshot.

The 21st Century Cures Act’s FDA language includes measures to streamline the design of clinical trials and expedite the approval of medical devices that have demonstrated potential to treat unmet medical needs and life-threatening conditions.

Impact on health IT

One of the 21st Century Cures Act interoperability provisions was among the law’s first to be put into action, with the January 2018 announcement by the Office of the National Coordinator for Health Information Technology (ONC) of the Trusted Exchange Framework.

The law includes specific language directing ONC to set up the framework – a nationwide system for sharing health data among networks run by healthcare systems, health information exchanges, insurers and other healthcare organizations.

Some of the law’s other health IT interoperability measures clarify ONC’s authority to certify health IT software to ensure that electronic protected health information is transferred securely and patients have unfettered access to their own health data.

Cures also gives the government authority to bar vendors and healthcare organizations from the practice known as information blocking, or impeding the flow of health data among healthcare providers, networks, vendors and patients.

Impact on mental health

The 21st Century Cures mental health impact is considerable, according to the American Psychiatric Association.

Among other provisions, the legislation:

  • Establishes a new position, the assistant secretary for mental health and substance abuse, intended to coordinate fragmented mental health resources across the federal government
  • Creates another new position, chief medical officer at the Substance Abuse and Mental Health Services Administration (SAMHSA)
  • Requires SAMHSA to develop a strategic plan every four years to better recruit, train and retain mental health and substance abuse disorder workers
  • Reauthorizes grants to support integrated care for mental and behavioral health, train mental health workers in evidence-based care, and fund college, university and professional programs to expand internships and field placement programs
  • Strengthens enforcement of the Mental Health Parity and Addiction Equity Act of 2008, which mandates that insurers treat mental and physical health issues equally


The bill was introduced on Jan. 6, 2015 by Rep. Suzanne Bonamici, D-Ore.

The House approved the bill on Jan. 7, 2015 and Senate approved it on Oct. 6, 2015 with an amendment and sent it back to the House, which agreed to the amendment on Nov. 30, 2016.

On Dec. 7, 2016, the Senate agreed to a new House amendment, and on Dec. 8, 2016 the bill was presented to the president, who signed it into law five days later.

Congress should close the loophole allowing warrantless digital car searches – TechCrunch

Most Americans expect the Fourth Amendment — which protects individuals from illegal searches — to extend to their digital lives.

In general, this expectation matches reality: unless law enforcement comes knocking with a warrant, the government cannot search a person’s phone or computer. However, cars are treated differently, and as “connected cars” become increasingly linked to people’s digital identities, there is a risk that police will use this exception to conduct digital searches without warrants.

Congress should close this loophole.

The Fourth Amendment is the cornerstone of people’s right to privacy and freedom from government intrusion in the United States. It requires the government to get a warrant based on probable cause before conducting a search and seizure of personal property.

The Supreme Court has found these protections important enough to update them for the digital world. For example, the court has extended warrant protections to cell phones and vehicle GPS tracking, and it is currently reviewing whether law enforcement officials should be required to get a warrant to obtain cellphone location information from wireless carriers.

However, there has been a long-standing exception for vehicles in the Fourth Amendment: law enforcement officials can stop and search a vehicle based on probable cause without having to get a warrant from a judge.

Photo: Joseph C. Justice Jr./Getty Images

For example, police officers can stop a vehicle for a routine traffic violation, and search it on the spot if the officers have probable cause that they will find contraband or the evidence of a crime. This lower standard for government searches makes sense in a physical world, where vehicles can only hold so much information and drivers can easily drive away to dispose of evidence.

But cars are changing, both in terms of the amount and the sensitivity of the information they can hold. Next-generation vehicles generate gigabytes of data while driving, enabling a host of new applications that enhance convenience, safety, and efficiency for drivers.

When this information can be accessed either through a display interface in the car or programmatically through an on-board computer, law enforcement could gain access to a significant amount of data about drivers without a warrant. For example, police could access in-car apps that contain sensitive information, such as navigation apps that contain travel history, social media apps that store messages and other personal information, and payment apps that contain information about past purchases.

While some of these applications require passwords, many only do so when the driver first logs in. Therefore, they would likely be unlocked when police pull over a driver.

In addition, many drivers may be intimidated into revealing their passwords during a stop, as has happened to travelers forced to unlock their phones at border crossings.

Finally, police could retrieve information stored in an on-board computer which may collect and store a variety of potentially sensitive information about drivers, including their driving behavior. Already, some police use special devices designed to circumvent built-in security measures on citizens’ phones and quickly copy their contents — similar devices could be designed for cars.

Photo: bjdlzx/Getty Images

Despite these potential risks, a car’s ability to collect information is not inherently privacy-invasive. And importantly, the automotive industry has taken pains to protect consumer privacy. For example, automakers made a series of public commitments in 2014 to establish strict privacy standards for data collected from vehicles, promising not to share consumer information with other businesses without affirmative consent — a standard that is higher than those found in other industries.

However, the auto industry cannot change the laws on digital searches. Policymakers should close this loophole to protect both citizens’ rights and support for technological progress. Congress has previously acted to close loopholes created by technological change.

For example, the Electronic Communications Privacy Act (ECPA), which limits how law enforcement can access digital information, has different legal standards for obtaining email stored on a PC and email stored in the cloud. As cloud computing adoption has grown, Congress has worked to pass a legislative fix.

Just as Congress has been working to close the loophole for cloud computing, it should close the loophole created by the convergence of digital technology with vehicles. Congress should require law enforcement officials to obtain a warrant before they can access data from a vehicle.

Congress can do this while maintaining the vehicle exception for physical searches and maintaining law enforcement’s access to data held by third parties, such as automakers or wireless providers, through warrants or other lawful processes.

By upholding citizen privacy, Congress can ensure a smooth road ahead for vehicles of the future.