Recruiters: Mind Your T’s and Q’s

Finding the right person for the growing range of new technologies can be difficult. There are skills shortages in a number of areas, and a rush to fill them with obvious talent. In this rush, however, it is important to remember that specific skills are of only transitory value; requirements will shift as the next new thing comes along.

Digitization is rapidly driving the convergence of traditionally disparate processes and technologies, creating a need for a new kind of worker. In focusing constantly on specific technical skills, we may be weakening the ability to understand the broader context that must fuel innovation. There needs to be input from the Arts, from global experience, and from the imagination. This demands a different type of learning.

In the late nineties, “T-shaped skills” were introduced, with the vertical bar representing depth of skills and the horizontal bar, the ability to work across disciplines. This was useful in a structured and deterministic world. But we are now in a time of vast changes, shifting skills requirements, and new pressures from robotics and AI. It’s time for a new, and complementary, concept.

Some years ago, I drew a cartoon about recruitment fads in reaction to “T-shaped” assumptions. In it, I introduced “Q-shaped” skills as “roundness of knowledge with a squiggly bit underneath.”

Although this was partly in jest, it does raise a significant point. In a converged world, “T-shaped” is no longer enough. Just as Steve Jobs drew from calligraphy in inventing the personal computer, exposure to a much wider range of knowledge is increasingly essential for innovation. Imagination and ingenuity are also at a premium.

Certainly, “T-shaped” skills will continue to be important. But handling the growing possibilities of digital convergence requires the nuanced “Q-shaped” skills that focus upon the big picture and its imaginative possibilities.

We have already seen how over-emphasis upon rote learning and tests can increase “T” and diminish “Q” skills. Companies lacking in the former will have trouble meeting the needs of the moment; companies lacking in the latter will fail to envision the opportunities of the future.

We need to increase our “Q” skills to create the Total Quality workforce of tomorrow.


Digital Transformation, an Evolving Vision: The Video

Digital Transformation is viewed by many companies these days as a strategic necessity. It can energize a business, build efficiency, reduce cost, and open the way for innovation. As with many concepts, however, it has become diffuse in meaning, and marketing strategies tend to focus upon a few simple areas in which digitization can help. What is missing is a greater story of the impact, implications, and actual consequences of undertaking such a transformative process.

Previous blogs on this subject are: Digital Transformation at the World Economic Forum: The Video; Digital Transformation: The Video; and Guest Blog: Lagging In Digital Transformation? 30% Of Your Senior Executives Are Going To Leave Within A Year.

We have now reached a point where most companies understand that this is coming and have programs in place for digitization of some aspects of their business. Now we need to look more closely at “Digitization 2.0,” where we begin to consider the opportunities and effects that have emerged from the innumerable successes and failures in this realm.

Here we have assembled a few videos on broader aspects of Digital Transformation. Most are posted under standard YouTube licenses, with explanations drawn from their landing pages. The CEB/Gartner video is a share from the company’s site, with description from that page.

The case for Digital Reinvention (McKinsey & Company)

Published on Feb 28, 2017

Digital technology, despite its seeming ubiquity, has only begun to penetrate industries. As it advances, the implications for revenues, profits, and opportunities will be dramatic. Here we explore the results of our survey into digitization across industries and detail the case for digital reinvention.

The New IT Operating Model for Digital (CEB Global – Gartner)

Two-thirds of business leaders believe that their companies must pick up the pace of digitization to remain competitive. The pace and volatility of digitization opportunities, as well as blurred technology responsibility across the enterprise, make it more difficult for IT leaders to help their organizations exploit emerging digital opportunities.

To meet these challenges head on, progressive IT leaders are changing IT’s operating model. We have identified nine features of the new operating model that will position IT teams for digital success.

The Digital Transformation Playbook (Columbia Business School)

Published on Jun 3, 2016

BRITE founder, author, and Columbia faculty member David Rogers talks at BRITE ’16 about how businesses need to transform by understanding that now: 1) customers are part of a network, 2) competition comes from platforms more than products, 3) data is a strategic asset, 4) innovation is driven by small experiments and scaling, 5) value is dynamic and adaptable. Get further insights and tools to make this transformation by reading his new book, “The Digital Transformation Playbook: Rethink Your Business for the Digital Age.”

The BRITE conference on brands, innovation and technology is hosted by the Center on Global Brand Leadership at Columbia Business School.

Digital Transformation of Society, Business (Gerd Leonhard)

Published on Apr 5, 2017

A talk by futurist Gerd Leonhard.

“This is the edited version of my keynote at DST Systems Advance conference in Phoenix Arizona. This talk is about digital transformation (challenges and opportunities) – the next 5 years. You can download the slide deck used in this talk via this link (PDF)”



Digital Transformation at the World Economic Forum: The Video

Digital Transformation and the convergence of digital technologies will become increasingly critical in the years to come, and it is important to understand what this means for people and industries, and how it is likely to impact you, both on the job and at home. It is an essential part of planning for business and government, and dominates business reorganization conversations.

Digital Transformation is also a very broad area, with enormous implications worldwide, and innumerable nuances both in understanding and in implementation. A video summation is a good way to focus on implications that might not have been considered, and the World Economic Forum (WEF) has recently delivered a set of interesting examples.

This is a topic that we have covered frequently in this blog. Recent posts include:

The Fungible Digital Universe: Convergence in a New Key

Digital Transformation: The Video

Guest Blog: Lagging In Digital Transformation? 30% Of Your Senior Executives Are Going To Leave Within A Year

The Art of Digital Transformation INFOGRAPHIC

Guest Blog: A Very Short History of Digitization

The WEF has been following the impact of Digital Transformation for some years now, and has interesting insights from its global developmental perspective. The main video and animation series are excellent guides to where this stands in early 2017.

All of the following videos were released under the standard YouTube license.

Digital Transformation Initiative (Digital Transformation of Industries)

Uploaded on Jan 15, 2017

The World Economic Forum (WEF) launched its Digital Transformation Initiative (DTI) in 2015. The initiative offers insights into the impact of digital technologies on business and wider society over the next decade. DTI research supports collaboration between the public and private sectors focused on ensuring that digitalization unlocks new levels of prosperity for both industry and society.

Following is the launch video from the WEF site.

The WEF “Challenges of Digital Transformation” Animation Series

A series of explanatory animations based upon WEF scripts was produced by Mair Perkins in support of the initiative and released in January 2017.

According to Perkins:

“Between November 2016 and January 2017 I worked on a series of animations for the World Economic Forum. The animations showcase some of the WEF’s research into the challenges of the digital transformation. It’s full of interesting facts about technology and predictions for the future.”

Research and script writing by the World Economic Forum;
Art Direction and storyboard design by Mair Perkins;
Illustration by Mair Perkins;
Animation by Mair Perkins, Sam Glyn Davies, Jack Pearson and Adam Horsepool;
Music and sound design by Ben Haynes.

Ep 1 of The Challenges of Digital Transformation – Animation for the World Economic Forum

Ep 2 – How Can Businesses Stay Ahead of Digital Disruption – Animation for the World Economic Forum

Ep 3: A Robot Will Take Your Job – Animation for the World Economic Forum

Ep 4 – Ensuring Digital Transformation Benefits Everyone – Animation for the World Economic Forum


The Fungible Digital Universe: Convergence in a New Key

With digital transformation so prominent in enterprise thinking, relatively little time has been spent considering its implications. Digital transformation is often presented as a means of improving efficiency and creating processes that can be easily and flexibly integrated across the corporation. But one of the key issues in digital transformation is digital convergence.

In digital transformation, the vision is to transform all processes, designs, work products, services, and anything else that might be so converted into a digital form. Digital form makes instantaneous transmission over vast distances possible, integrates processes, and opens them to replication or posting within the cloud. The advantages are extraordinary, and digital transformation has been underway for a very long time. But there is another factor in this conversation: the fact that all digital streams can easily be subjected to the same or similar processes, thereby opening the way for extreme innovation and cross-pollination as diverse and heretofore entirely discrete fields are brought together.

As we move into an era of embedded artificial intelligence and big data analytics, digital convergence becomes an issue of extreme importance. The same processes that analyze and interrogate one digital stream might easily be applied to another; the algorithms used in deep learning, for example, move easily between image and voice recognition. Similarly, because they embody the same digital format, the same algorithms might be used to find patterns within programs; to analyze architectural drawings; to understand transactions; and to provide new forms of user access and data comprehension. Digital convergence becomes a kind of synesthesia, where the boundaries between objects, activities and functions become increasingly blurred.

Digital convergence as an issue first arose with telecommunications, and then with multimedia. These areas were early examples of how digitization erased the boundaries between dissimilar components. As everything became transmissible in digital form across the network, markets for multimedia changed; intellectual property rights became problematic; and the capability to copy and broadcast items such as movies and audio recordings became nearly infinite. Legal systems are still struggling to fit the new possibilities within social and legal frameworks and understandings. Now, with everything becoming digital, issues such as intellectual property, privacy, and segregation of one component of the universe from another become moot.

The world itself, as we know it, is a fabrication of knowledge whose definitions, patterns, and interactions we digest and share. It is an imperfect system since it is overlaid upon an existing physical reality; the vertices become sharp and apparent, and logical argumentation becomes less precise because definitions encompass our understanding of an item rather than the item itself. In a converged digital universe, the world we interact with is, in fact, the primary level. Language starts to stumble in its description, because all of those new concepts become terms whose meaning changes as swiftly as the territory shifts.

Digital transformation is the means by which we are moving to this more flexible, more fungible universe; it is an essential process for business, which will reap immediate benefits in being able to act far more swiftly than any non-digital process allows. But, in a greater sense, it will also change how we understand the universe and how future interactions will take place as human beings evolve toward assisted and hybrid man-machine thought processes.


Digital Transformation: The Video

Digital Transformation is critical to everything about the future of IT and, increasingly, of everything else. By converting processes, work products, goods, and services into digital formats, it is possible to transform and manipulate them through software. The manipulation processes are, moreover, similar across the entire universe of digital data. Digitization means information and plans can be communicated instantly; processes can be transformed swiftly; and the whole spectrum of analytics and AI can be applied to the data stream.

Companies need to transform their analog processes into digital equivalents just to catch up with the present; integrating these processes into current software and services will permit them to competitively reach the future.

Here we have a number of videos focusing upon Digital Transformation, along with the explanations provided. These are all available under the standard YouTube license.

Davos 2016 – The Digital Transformation of Industries (World Economic Forum)

What big bets are companies making in the digital transformation of their business models and organizational structures?

On the agenda:

  • Defining digital transformation
  • Making the right investment decisions
  • Designing a digital culture


  • Marc R. Benioff, Chairman and Chief Executive Officer, Salesforce, USA.
  • Klaus Kleinfeld, Chairman and Chief Executive Officer, Alcoa, USA.
  • Jean-Pascal Tricoire, Chairman and Chief Executive Officer, Schneider Electric, France.
  • Bernard J. Tyson, Chairman and Chief Executive Officer, Kaiser Permanente, USA.
  • Meg Whitman, President and Chief Executive Officer, Hewlett Packard Enterprise, USA.

Moderated by Rich Lesser, Global Chief Executive Officer and President, Boston Consulting Group, USA.

(Creative Commons license; Published on Jan 29, 2016)

Digital Transformation – The Business World of Tomorrow (Detecon International)

Smart business networks, sensors, the Internet of Things, or teamwork which is distributed solely as virtual assignments – many of these innovations will result in substantial changes to business strategies as well. What has not changed, however, is the extreme complexity in the design of digital business models resulting from the frequent lack of transparency concerning concrete operational consequences in the corporate business units.

What capabilities does a company require for the realization of digitalized strategies and processes? What steps are necessary to become “Leading Digital”? And is all of this work even worth the effort? (Published on Jan 22, 2014)

What Digital Transformation Means for Business (MIT Sloan Management Review)

An all-star panel discusses digital transformation at Intel, Zipcar — and beyond. Panelists include Kim Stevenson (Intel Corporation), Mark Norman (Zipcar), Didier Bonnet (Capgemini Consulting), Andrew McAfee (MIT Center for Digital Business)

(Published on Jun 28, 2014)

Why Every Leader Should Care About Disruption (McKinsey & Company)

Digitization, automation, and other advances are transforming industries, labor markets, and the global economy. In this interview, MIT’s Andrew McAfee and McKinsey’s James Manyika discuss how executives and policy makers can respond.

The disruptive impact of technology is the topic of a McKinsey-hosted discussion among business leaders, policy makers, and researchers at this year’s meeting of the World Economic Forum, in Davos, Switzerland. In this video, two session participants preview the critical issues that will be discussed, including the impact of digitization and automation on labor markets and how companies can adapt in a world of rapid technological change.


(Published on Jan 4, 2015)

Leading Digital Transformation Now – No Matter What Business You’re In (Capgemini Group)

In this keynote session recorded at Oracle OpenWorld 2014, Dr. Didier Bonnet, Capgemini Consulting’s global head of digital transformation and coauthor (with MIT’s George Westerman and Andrew McAfee) of the upcoming book “Leading Digital,” highlights how large companies in traditional industries—from finance to manufacturing to pharmaceuticals—are using digital to gain strategic advantage.

Didier also discusses the principles and practices that lead to successful digital transformation based on a two-part framework: where to invest in digital capabilities, and how to lead the transformation. (Published on Sep 30, 2014)

Ring the Welkin: Everything is Related, and a New Era has Begun

We cannot ignore the outside world as technology continues to press forward. Technology affects economies, jobs, and public opinion. It will continue to inform aspirations, fears, and political dialogues around the world.

The impact of recent technology has been so profound that society has not really caught up with it. This is the “Cartoon Cliff” effect: people continue forward just as they always have, until the moment they look down and discover the earth is very far below. Perhaps, with the US Election, that moment has arrived.

Society reacts. Hope for the best, plan for the worst, and remember that the outcome is far from certain. The past is not necessarily a guide to the future. But it is important to remain optimistic. Adjustments are always necessary, no matter how painful they might be. There will be new opportunities and new possibilities. Even as some doors begin to close, others will open. Changes in the geopolitical lineup will hasten trends that have been building for years.

Guest Blog: Lagging In Digital Transformation? 30% Of Your Senior Executives Are Going To Leave Within A Year

Original blog link:

Lagging In Digital Transformation? 30% Of Your Senior Executives Are Going To Leave Within A Year

by Gil Press

Companies lagging in their digital transformation, or not even trying to become digital, face the risk of losing substantial portions of their sales, IT leadership, and senior management. About 30% of senior vice presidents, vice presidents, and director-level executives who don’t have adequate access to resources and opportunities to develop and thrive in a digital environment are planning to leave their company in less than one year.

This is one of the key findings of a new research report, Aligning the Organization for its Digital Future. It is based on a worldwide survey of 3,700 business executives, managers, and analysts, conducted for the fifth year in a row by MIT Sloan Management Review, in collaboration with Deloitte.

There is remarkable across-the-board agreement about digital disruption: 87% of those surveyed believe it will impact their industry. This is considerably up from last year’s survey, in which only 26% said that digital technologies presented a threat of any kind. Despite the much-increased anticipation of digital disruption, only 44% think their organizations are adequately preparing for it. Similarly, a recent Gartner survey of IT professionals found that 59% said their IT organization is unprepared for the digital business of the next two years.

“Digital” has a strong external orientation, according to the reported objectives of the digital strategy of the organizations surveyed. 64% “strongly agree” with improving customer experience and engagement as a key objective. Only 41% cite “fundamentally transform business processes and/or business model.”

While the orientation of companies’ digital strategy is primarily external, the perceived obstacles to digital success are primarily internal. The biggest barrier impeding the organization from taking advantage of digital trends is too many competing priorities, followed by a lack of organizational agility. “Disruption,” to these respondents, begins at home, not with the startups promising to disrupt their industry.

Understanding technology is a required but not the most important skill for success in a digital workplace. Says the report: “In an open-ended question, respondents said that the ability to steer a company through business model change is the most important skill, cited by 22%.” They also think that there are not enough people with the right skills. Only 11% say that their company’s current talent base can compete effectively in the digital economy.

The report goes beyond the raw data to assess “companies’ sophistication in their use of digital technologies.” Explaining the methodology for this assessment, it says:

“For the past two years, we have conducted surveys in which we asked respondents to “imagine an ideal organization transformed by digital technologies and capabilities that improve processes, engage talent across the organization, and drive new value-generating business models.” We then asked them to rate their company against that ideal on a scale of 1 to 10. Respondents fall into three groups: companies at the early stages of digital development (rating of 1-3 on a 10-point scale, 32% of respondents), digitally developing companies (rating of 4-6, 42% of respondents), and businesses that are digitally maturing (rating of 7-10, 26% of respondents).”
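The three-group bucketing described in that methodology can be sketched as a small function. The group labels paraphrase the report’s categories; the function itself is purely illustrative:

```python
def maturity_group(rating: int) -> str:
    """Map a 1-10 self-rating to the report's three maturity groups."""
    if not 1 <= rating <= 10:
        raise ValueError("rating must be between 1 and 10")
    if rating <= 3:
        return "early"        # 32% of respondents
    if rating <= 6:
        return "developing"   # 42% of respondents
    return "maturing"         # 26% of respondents

print(maturity_group(2))   # early
print(maturity_group(5))   # developing
print(maturity_group(9))   # maturing
```

Note that the thresholds are fixed by the survey design, not derived from the data, which is part of why the subjectivity critique below has force.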

The assessment of whether a company is digitally mature or not is a subjective assessment by the respondents, not by outside observers applying objective criteria. It may well be that the respondents who rated their companies low on the digital maturity scale simply are not happy with their current employer—not enough opportunities to develop, generally incompetent leaders, too much hierarchy and not enough collaboration.

Notwithstanding the issue of how digitally mature companies were identified, the report’s conclusion—and prescription—is that to succeed in a digital world you must adopt a digital culture. It says:

“A key finding in this year’s study is that digitally maturing organizations have organizational cultures that share common features…The main characteristics of digital cultures include: an expanded appetite for risk, rapid experimentation, heavy investment in talent, and recruiting and developing leaders who excel at “soft” skills.”

Sounds to me very much like the prescriptions for business success emanating from business schools for at least half a century, long before “digital” became a set of new technologies, processes, and attitudes companies must invest in and take advantage of to stay competitive.

The importance of becoming digital today is a good enough reason to read the report carefully and take note of how business executives in 131 countries and 27 industries answered the questions posed to them. The Sloan Management Review and Deloitte should be commended for conducting a large annual survey probing the state-of-the-art of digital transformation.

But for a more convincing assessment of what constitutes “digital maturity” we will have to wait until Sloan and Deloitte (or someone else) conduct research that objectively compares companies that have invested heavily in “digital” with companies that have invested only lightly in this new new thing. A difficult research challenge, no doubt, as very few companies willingly admit to falling behind the times.

The findings will be even more meaningful if the research objectively compares successful companies (e.g., profitable) not investing in digital with not-so-successful companies (e.g., losing money or market share) that have totally embraced digital. Aren’t there companies out there today that are hierarchical, risk-averse, and do not invest in talent and digital, but still make a ton of money? Can we be absolutely confident that these will not be the characteristics of (at least some) successful companies in the future?

Gil Press

Gil Press is Managing Partner at gPress, a marketing, publishing, research and education consultancy. Previously, he held senior marketing and research management positions at NORC, DEC and EMC. He was Senior Director, Thought Leadership Marketing at EMC, where he launched the Big Data conversation with the “How Much Information?” study (2000 with UC Berkeley) and the Digital Universe study (2007 with IDC). He blogs at and Twitter: @GilPress

Guest Blog: A Very Short History of Digitization

Original blog link:

A Very Short History of Digitization

Milestones in the story of the adoption and proliferation of today’s most widely spoken language, the computer’s binary code.

by Gil Press

Ones and zeros are eating the world. The creating, keeping, communicating, and consuming of information are all being digitized, turned into the universal language of computers. All types of enterprises, from small businesses to large corporations to non-profits to government agencies, are going through a “digital transformation,” turning digitization into new processes, activities, and transactions.

From the 1950s on, with a distinct bounce in the 1990s due to the advent of the Web, digitization has changed the way we work, shop, bank, travel, educate, govern, manage our health, and enjoy life. The technologies of digitization enable the conversion of traditional forms of information storage such as paper and photographs into the binary code (ones and zeros) of computer storage. A sub-set is the process of converting analog signals into digital signals. But much larger than the translation of any type of media into bits and bytes is the digital transformation of economic transactions and human interactions.
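The conversion of analog signals into digital form can be illustrated with a minimal sketch: sample a continuous sine wave at a fixed rate and quantize each sample to a signed 8-bit integer, as pulse-code modulation does. The sample rate and bit depth here are illustrative choices, not drawn from the text:

```python
import math

def sample_and_quantize(freq_hz, duration_s, rate_hz=8000, bits=8):
    """Sample an analog sine wave and quantize each sample to a signed integer."""
    levels = 2 ** (bits - 1) - 1          # 127 for 8-bit signed samples
    n = int(rate_hz * duration_s)
    return [round(levels * math.sin(2 * math.pi * freq_hz * t / rate_hz))
            for t in range(n)]

samples = sample_and_quantize(440, 0.001)  # 1 ms of a 440 Hz tone: 8 samples
print(samples)
```

Once the signal exists as this list of integers, it can be stored, copied, transmitted, and analyzed like any other digital data, which is the point of the paragraph above.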

The expression of data as ones and zeros facilitates its generation, replication, compression, and dissemination (see A Very Short History of Big Data); its analysis (see A Very Short History of Data Science); and its organization (see A Very Short History of the Internet and the Web). It also encourages the replacement or augmentation of the physical with the virtual or online presence (see A Very Short History of the Internet of Things).

Here are a few milestones in the story of the adoption and proliferation of today’s most widely spoken language.

1679 Gottfried Wilhelm Leibniz develops the modern binary number system and, in 1703, publishes Explication de l’Arithmétique Binaire (Explanation of Binary Arithmetic), linking it to ancient China.
1755 Samuel Johnson publishes A Dictionary of the English Language and includes an entry for “Binary arithmetick,” quoting Ephraim Chambers’ Cyclopaedia: “A method of computation proposed by Mr. Leibnitz, in which, in lieu of the ten figures in the common arithmetick, and the progression from ten to ten, he has only two figures, and uses the simple progression from two to two. This method appears to be the same with that used by the Chinese four thousand years ago.”
1847 George Boole introduces Boolean algebra in The Mathematical Analysis of Logic, creating the field of mathematical logic, leading eventually to universal computation. In 1854, he writes in An Investigation into the Laws of Thought: “The respective interpretation of the symbols 0 and 1 in the system of logic are Nothing and Universe.”
1937 Claude Shannon submits his master’s thesis at MIT, establishing the theoretical underpinnings of digital circuits. Shannon showed how Boolean algebra could optimize the design of systems of electromechanical relays then used in telephone routing switches.
1938 Alec Reeves conceives of the use of pulse-code modulation (PCM) for voice communications, digitally representing sampled analog signals. It was not used commercially until the 1950s, when the invention of the transistor made it viable. PCM has become the standard form of digital audio in computers, compact discs, digital telephony and other digital audio applications.
1940 John V. Atanasoff writes in Computing Machine for the Solution of Large Systems of Linear Algebraic Equations, a paper describing the electronic digital calculating machine he has built with Clifford Berry: “…for mechanized computation, the base two shows a great superiority… a card of a certain size used with the base-two recording system will carry more than three times as much data as if used with the conventional base-ten system.”
1943 The SIGSALY secure speech system performs the first digital voice transmission, used for high-level Allied communications during World War II.
June 25, 1945 John von Neumann’s A First Draft of a Report on the EDVAC is distributed to 24 people working on the development of the EDVAC, one of the earliest computers. It documents the key decisions made in the design of the EDVAC, among them the decision to use binary to represent numbers, thus reducing the number of components required compared to its predecessor, the ENIAC, which used the decimal system. The document became the technological basis for all modern computers.
1948 Claude Shannon publishes “A Mathematical Theory of Communication” in the July and October issues of the Bell System Technical Journal. Shannon: “If the base 2 is used [for measuring information] the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.”
Summer 1949 Claude Shannon lists in his notebook the storage capacity in bits of a number of items. He estimated that a punch card has just under 10³ bits and a single-spaced typed page 10⁴ bits. Four years before the discovery of the double-helix structure of DNA, Shannon estimated that the “genetic constitution of man” is about 10⁵ bits. The largest holder of bits he could think of was the Library of Congress, which he estimated to hold 10¹⁴ bits of information (p. 232 in The Information by James Gleick).
1954 General Electric’s Major Appliance Division plant in Louisville, Kentucky, installs the UNIVAC I computer, the first business use—payroll processing and manufacturing control programs—of a computer in the United States. “The Univac I was also hooked up with speakers, and the operator had the machine playing classical music each evening,” recalls Burton Grad who designed and wrote (in machine language) a manufacturing control program for GE’s Dishwasher and Disposer Department.
1955 John Hancock Mutual Life Insurance Co., a pioneer in digitizing customer information, digitizes 600 megabytes of two million life-insurance policies.
September 4, 1956 IBM announces the 350 Disk Storage Unit, the first computer storage system based on magnetic disks and the first to provide random access to stored data. It came with fifty 24-inch disks and a total capacity of 5 megabytes, weighed 1 ton, and could be leased for $3,200 per month; its first customer was United Airlines’ reservations system.
September 14, 1956 IBM announces the 305 RAMAC and the 650 RAMAC (Random Access Memory Accounting) which incorporated the 350 Disk Storage Unit. It promised, as the IBM press release said, “that business transactions will be completely processed right after they occur. There will be no delays while data is grouped for batch processing… Random access memory equipment will not only revolutionize punched card accounting but also magnetic tape accounting.”
When it was exhibited in the 1958 Brussels World’s Fair, visitors could query “Professor RAMAC” using a keyboard and get answers in any of ten languages. The RAMAC became obsolete within a few years of its introduction as the vacuum tubes powering it were replaced by transistors. But disk drives, invented in a search for faster access to information, are still used as the containers for almost all digital information today.
1960 American Airlines’ Sabre flight-reservation system digitizes a process developed in the 1920s, processing 84,000 telephone calls per day and storing 807 megabytes of reservations, flight schedules and seat inventory.
1962 The term database is mentioned in print for the first time, according to the Oxford English Dictionary, quoting a Systems Development Corporation technical memo: “A ‘data base’ is a collection of entries containing item information that can vary in its storage media and in the characteristics of its entries and items.”
1963 Charles Bachman, at GE’s computer division, develops the Integrated Data Store (IDS), one of the first database management systems using what came to be known as the navigational database model in the Manufacturing Information and Control System (MIACS) product.
April 19, 1965 Gordon Moore publishes “Cramming more components onto integrated circuits” in Electronics magazine, the first formulation of what came to be known as “Moore’s Law.” The observation of the constant doubling of the number of transistors that can be “crammed” into an integrated circuit became the rallying cry that has guided manufacturing process innovations, reducing the price and increasing the power of electronic components and driving a constant expansion of the scope and reach of digitization.
1968 U.S. libraries begin using Machine Readable Cataloging (MARC) records.
1969 Willard Boyle and George E. Smith at AT&T Bell Labs invent the charge-coupled device (CCD), transforming light into electric signals. The CCD has played a major role in the development of digital imaging in general and the development of digital cameras and medical imaging in particular. Boyle and Smith were awarded the 2009 Nobel Prize in Physics.
June 1970 Edgar F. (“Ted”) Codd publishes “A relational model of data for large shared data banks,” in the Communications of the ACM, presenting the theoretical basis for relational databases, which became the dominant type of databases from the 1980s to around 2000.
1971 Arthur Miller writes in The Assault on Privacy that “Too many information handlers seem to measure a man by the number of bits of storage capacity his dossier will occupy.”
July 4, 1971 Michael Hart launches Project Gutenberg with the goal of making copyright-free works electronically available by entering the text of the U.S. Declaration of Independence into the mainframe he was using at the University of Illinois.
1972 Pulsar, the world’s first all-electronic digital watch and the first to use a digital display, is launched.
1973 Charles Bachman is awarded the Turing Award. From The Programmer as Navigator, Bachman’s Turing Award lecture: “Copernicus presented us with a new point of view and laid the foundation for modern celestial mechanics… A new basis for understanding is available in the area of information systems. It is achieved by a shift from a computer-centered to the database-centered point of view. This new understanding will lead to new solutions to our database problems and speed our conquest of the n-dimensional data structures which best model the complexities of the real world… The availability of direct access storage devices laid the foundation for the Copernican-like change in viewpoint… From this point, I want to begin the programmer’s training as a full-fledged navigator in an n-dimensional data space.”
December 1975 The first digital camera, invented by Steven Sasson at Eastman Kodak, takes 23 seconds to capture its first image. The camera weighed 8 pounds, recorded black and white images to a compact cassette tape, and had a resolution of 0.01 megapixels.
1977 Citibank installs its first ATM. By the end of the year, all the bank’s New York branches had at least two machines operating 24 hours a day, seven days a week, ensuring 24-hour access in case one fails. When a huge blizzard hit New York in January 1978, banks were closed for days and ATM use increased by 20%. Within days, Citibank had launched its “The Citi Never Sleeps” ad campaign. A decade later, the bank’s ATM network stored 450 megabytes of electronic transactions.
1979 Federal Express launches COSMOS (Customers, Operations, and Services Master Online System), digitizing the management of people, packages, vehicles, and weather scenarios in real time, with a computer storage capacity of 80 gigabytes.
April 1980 I.A. Tjomsland gives a talk titled “Where Do We Go From Here?” at the Fourth IEEE Symposium on Mass Storage Systems, in which he says “Those associated with storage devices long ago realized that Parkinson’s First Law may be paraphrased to describe our industry—‘Data expands to fill the space available.’”
1981 Edgar F. (“Ted”) Codd is awarded the Turing Award for his fundamental and continuing contributions to the theory and practice of database management systems—“whenever anyone uses an ATM machine, or purchases an airline ticket, or uses a credit card, he or she is effectively relying on Codd’s invention.”
In his Turing Award Lecture, Codd notes that “As it stands today, relational database is best suited to data with a rather regular or homogeneous structure. Can we retain the advantages of the relational approach while handling heterogeneous data also? Such data may include images, text, and miscellaneous facts. An affirmative answer is expected, and some research is in progress on this subject, but more is needed.” The challenge of heterogeneous data or “big data” will be addressed almost three decades later but not with a relational database approach.
July 9, 1982 The movie Tron, in which the Jeff Bridges character is digitized by an experimental laser into a mainframe where programs are living entities appearing in the likeness of the humans who created them, is released.
August 17, 1982 The first commercial compact disc (CD) is produced, a 1979 recording of Claudio Arrau performing Chopin waltzes.
1984 8.2% of all U.S. households own a personal computer, the U.S. Census Bureau finds in its first survey of computer and Internet use in the United States. In 2013, 83.8% of U.S. households reported computer ownership, with 74.4% reporting Internet use.
February 1985 The Whole Earth ’Lectronic Link (WELL) is established, one of the first “virtual communities.”
1988 More compact discs (CDs) are sold than vinyl records.
June 1990 General Instruments, an American manufacturer of cable television converters and satellite communications equipment, upsets the race to build the television of the future by announcing it has succeeded in squeezing a digital HDTV signal into a conventional broadcast channel. Up until then all the companies preparing proposals for an HDTV standard were working on analog systems.
1991 The first 2G cellular network is launched in Finland. 2G networks used digital signals rather than analog transmission between mobile phones and cellular towers, increasing system capacity and introducing data services such as text messaging.
July 1992 Tim Berners-Lee posts the first photo uploaded to the Web, showing the all-female parody pop group Les Horribles Cernettes (LHC), consisting of four of his colleagues at CERN.
May 1993 O’Reilly Digital Media group launches the Global Network Navigator (GNN), the first commercial web publication and the first website to offer clickable advertisements.
1994 Teradata has the largest commercial database at 10 terabytes.
Summer 1994 A large pepperoni, mushroom and extra cheese pizza from Pizza Hut is ordered online, possibly the first transaction on the Web.
October 1994 HotWired is the first web site to sell banner ads in large quantities to a wide range of major corporate advertisers.
1995 After a five-year pilot project, the National Digital Library program begins digitizing selected collections of Library of Congress archival materials.
June 1995 The Norwegian Broadcasting Corporation (NRK) launches the world’s first Digital Audio Broadcasting (DAB) channel.
November 22, 1995 Toy Story opens in U.S. theaters, the first feature film made entirely with computer-generated imagery (CGI).
1996 Brewster Kahle establishes the Internet Archive to preserve and provide access to nearly every site on the Web, later evolving to become a comprehensive digital library. Other Web archiving projects launched in 1996 include the National Library of Australia’s PANDORA Project and the Royal Library of Sweden’s Kulturarw Heritage Project.
1996 Digital storage becomes more cost-effective for storing data than paper.
1996 E-gold is launched, becoming the first successful digital currency system to gain a widespread user base and merchant adoption.
1998 Jim Gray is awarded the Turing Award for seminal contributions to database and transaction processing research and technical leadership in system implementation.
1998 Production of analog cameras peaks at almost 40 million as they are replaced by digital cameras.
1998 Digital Television transmission commences in the U.K. and the U.S., launching the process of converting and replacing analog television broadcasting with digital television.
March 25, 1998 Microsoft patents ones and zeroes, says The Onion.
October 23, 1998 The Last Broadcast is the first feature-length movie shot, edited and distributed digitally via satellite download to 5 theaters across the United States.
December 1998 Nicholas Negroponte writes in Wired: “Like air and drinking water, being digital will be noticed only by its absence, not its presence.”
1999 Wal-Mart has the largest commercial database at 180 terabytes.
2000 The number of photos preserved on film annually peaks at 85 billion, rapidly replaced in subsequent years by digital photos.
September 2000 MP3 player manufacturer i2Go launches a digital audio news and entertainment service that enabled users to download news, sports, entertainment, weather, and music in audio format. In February 2004, Ben Hammersley writes in the Guardian: “Online radio is booming thanks to iPods, cheap audio software and weblogs… But what to call it? Audioblogging? Podcasting? GuerillaMedia?”
January 1, 2001 The Electronic Product Code (EPC) is defined at MIT as a replacement for the Universal Product Code (UPC or ‘bar code’).
2002 Digital information storage surpasses non-digital for the first time.
2003 More digital cameras than traditional film cameras are sold in the U.S. for the first time.
2003 Electronic payments in the U.S. surpass the use of cash and checks for the first time.
June 2003 The DVD format (launched in the late 1990s) becomes more popular than VHS in the U.S.
October 2003 The Check 21 Act makes check images a legal transfer medium in the U.S., allowing financial institutions to create a digital version of the original check. Over 50 billion paper checks were processed in the U.S. in 2003.
2004 Google announces it is working with the libraries of Harvard, Stanford, the University of Michigan, and the University of Oxford as well as The New York Public Library to digitally scan books from their collections. The Internet Archive starts a similar effort, the Million Book Project.
2007 94% of the world’s information storage capacity is digital, a complete reversal from 1986, when 99.2% of all storage capacity was analog.
2008 More music is sold by iTunes than by Wal-Mart.
October 2008 Satoshi Nakamoto publishes “Bitcoin: A Peer-to-Peer Electronic Cash System,” describing the first decentralized digital currency. In October 2015, The Economist stated that blockchain, the technology behind bitcoin, “could transform how the economy works.”
2010 Online advertising ($26 billion) in the United States surpasses newspaper advertising ($22.8 billion) for the first time.
2010 Production of digital cameras peaks at just over 120 million as they are replaced by smartphones.
January 2011 Jean-Baptiste Michel, et al., publish “Quantitative Analysis of Culture Using Millions of Digitized Books” in Science. On the basis of a corpus of digitized texts containing about 4% of all books ever printed, they investigate linguistic and cultural phenomena reflected in the English language between 1800 and 2000, calling their field of study “culturomics.”
2011 Amazon sells more Kindle books than print books.
2012 U.S. consumers pay more for online movies than for DVDs and Blu-ray discs, for the first time.
2012 180 petabytes (180 million gigabytes) are added annually to Facebook’s data warehouse which has grown 2500x in the past four years.
December 2012 Annual e-commerce sales top $1 trillion worldwide for the first time.
2014 Streaming revenue from services like Spotify and Pandora overtakes CD sales for the first time.
February 2014 45% of Internet users ages 18-29 in serious relationships say the Internet has had an impact on their relationship.
Summer 2014 The number of Internet users worldwide reaches 3 billion.
2015 Michael Stonebraker is awarded the Turing Award for fundamental contributions to the concepts and practices underlying modern database systems.
2015 Every minute, Skype users make 110,040 calls, Twitter users send 347,222 tweets, YouTube users upload 300 hours of new videos, Pinterest users pin 9,722 images, Netflix subscribers stream 77,160 hours of video, Snapchat users share 284,722 snaps, and Facebook users like 4,166,667 posts.
2015 Digital America: A tale of the haves and have-mores, a McKinsey Global Institute (MGI) report, is the first major attempt to measure the ongoing digitization of the U.S. economy at a sector level. It introduces the MGI Industry Digitization Index, which combines dozens of indicators to provide a comprehensive picture of where and how companies are building digital assets, expanding digital usage, and creating a more digital workforce. Because the less digitized sectors are some of the largest in terms of GDP contribution and employment, MGI concludes that the U.S. economy as a whole is only reaching 18% of its digital potential, and estimates that digitization could add up to $2.2 trillion to annual GDP by 2025.
Gil Press

Gil Press is Managing Partner at gPress, a marketing, publishing, research and education consultancy. Previously, he held senior marketing and research management positions at NORC, DEC and EMC. He was Senior Director, Thought Leadership Marketing at EMC, where he launched the Big Data conversation with the “How Much Information?” study (2000 with UC Berkeley) and the Digital Universe study (2007 with IDC). He blogs and is on Twitter: @GilPress

Software Modernization and your Smart Digital Future

Code modernization is essential in transitioning to digital business. Ancient code carries numerous liabilities, both in integration and in security. Fundamental to the problem is the fact that languages, programming approaches, and the surrounding IT environment all evolve even as the business itself evolves. Programs therefore accumulate technical debt, leading to growing inefficiencies and maintenance costs over time, and that accumulated debt complicates any conversion effort. Yet, as we move into a future of newly designed smart processes and omnipresent digital interactions, it is certain that radical change and more invasive modernization will be necessary.

Clearly, a general approach is needed, one that leads both to effective conversion and to meeting the unknown requirements of the future. Companies that wish to change need to centralize the modernization effort and identify the technologies specifically applicable to the firm. In this context, it is important to consider the ROI of change efforts: modernization must provide both for the current situation and for the unknown environment of the future.

One of the most persistent problems in modernization is the migration of COBOL, which exists in millions of lines across critical applications in high-accuracy, high-volume areas such as finance and healthcare. These systems are particularly vulnerable as the industry evolves to meet complex new requirements from clients and partners. They have often operated for many years as “black boxes” around which new code is wrapped, but this approach must eventually break down. It entails a growing maintenance burden and serious security risks when pathways are built into the code to enable API access to obscure routines. Familiarity with the code base disappears as employees retire, and there is a growing shortage of talent and experience in working with older programs.

To make essential changes and build for a digital and interconnected future, there is a range of possible remedies. These include:

  • Continuing the black box. Since the software has operated for many years and performed its functions, you can hope that nothing bad will happen and simply continue the maintenance cycle. You will face increasingly expensive maintenance and potentially serious security flaws as the years drag on. There will also be an opportunity cost, as new technologies become unavailable because critical code cannot easily be accessed or modified.
  • Off-the-shelf replacement. It is sometimes possible to replace critical programs originally built in-house with commercial software or SaaS solutions. This often requires considerable customization, and the result will be less flexible. Processes may need to change, licensing costs will be incurred, and there may be integration issues and unforeseen consequences.
  • Greenfield replacement. Starting from scratch demands a project at least the size of the original one. The lessons learned in the original coding will be lost, and there are likely to be huge overruns in time and cost, both in adding new features and in ensuring that critical functions continue to operate as expected.
  • Manual conversion. Manual conversion or refactoring of the original system can be a massive project, potentially larger and more expensive than the original. It is possible to move to a modernized COBOL dialect or to a later-generation language. But without specific knowledge of the original code and access to the programmers’ logic, much of the original functionality can be compromised. Such projects have very poor rates of on-time, successful completion. The same is true of many “lift and shift” efforts that convert the application and move it to the cloud.
  • Incremental conversion. Large programs can be split up, with only critical “must change” code subject to conversion. This provides short-term benefits, but it also potentially adds technical debt at the interfaces, and the original code that persists will remain a potential source of future problems.
  • Automated model-based conversion. For some situations, an automated conversion based on modeling can provide a cost-effective outcome, depending upon the technology in use. Here, the original code is converted into a semantic model, which is then optimized and used to generate code in another language.
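The model-based approach can be sketched in miniature. The following is a toy illustration, not a real conversion tool: it lifts a single COBOL-style COMPUTE statement into a tiny semantic model and then generates equivalent Python from that model. The `Assign` node, the one-statement grammar, and the identifier rules are assumptions made for illustration; production converters model the full language and optimize the intermediate representation before emitting code.

```python
# Toy sketch of model-based conversion: source text -> semantic model -> new code.
# Assumptions (for illustration only): a single COBOL-style COMPUTE statement
# with one binary arithmetic expression; real tools cover the full grammar.
import re
from dataclasses import dataclass

@dataclass
class Assign:
    """Semantic model node: target := left <op> right."""
    target: str
    left: str
    op: str
    right: str

def parse_compute(line: str) -> Assign:
    """Lift one COBOL-style COMPUTE statement into the semantic model."""
    m = re.fullmatch(
        r"COMPUTE\s+([\w-]+)\s*=\s*([\w-]+)\s*([+\-*/])\s*([\w-]+)\s*\.?",
        line.strip(),
    )
    if not m:
        raise ValueError(f"unsupported statement: {line!r}")
    return Assign(*m.groups())

def emit_python(node: Assign) -> str:
    """Generate target-language code from the model (here, Python)."""
    def ident(name: str) -> str:
        # COBOL names use hyphens; map them to the target language's style.
        return name.lower().replace("-", "_")
    return f"{ident(node.target)} = {ident(node.left)} {node.op} {ident(node.right)}"

model = parse_compute("COMPUTE GROSS-PAY = BASE-PAY + OVERTIME.")
print(emit_python(model))  # gross_pay = base_pay + overtime
```

The key design point is the intermediate model: because generation reads only the `Assign` node, the same parsed model could target Java or C# instead of Python by swapping the emitter, which is what makes the approach attractive when the destination platform is still undecided.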

Each situation is likely to have different needs and to demand a different solution. This is part of the reason that conversion has become such an intractable problem.

There are numerous companies involved in modernizing code and in bringing older programs into the digital environment, and approaches differ hugely depending upon whether you are looking at a change of platform, a coding conversion, an update, a refactoring, or a rewrite of ancient routines. The most important task is to determine the overall modernization requirements: what is absolutely critical, and what can be deferred. Modernization can be very expensive; it also needs to be correct.