
Introduction to the InfoTech Industry, Business and Industry Trends Analysis

The technology breakthrough that enabled the modern computer occurred shortly after the end of World War II, when researchers at Bell Laboratories in New Jersey created the first working transistor on December 16, 1947.  William Shockley, John Bardeen and Walter Brattain later received a well-deserved Nobel Prize in Physics for their groundbreaking work.
What started with one transistor has grown at an astonishing rate.  Consider the steady evolution of chips from Apple and Intel:  In 1978, Intel’s wildly popular 8086 processor contained 29,000 transistors.  At the time, this was an immense leap forward in computing, and it opened the door to what would eventually become a vibrant industry that manufactures powerful desktop computers.  Fifteen years later, in 1993, Intel introduced the first Pentium processor, with 3.1 million transistors.  Today’s most powerful chips each contain billions of transistors.  Worldwide sales of semiconductors were $533 billion in 2023, according to Gartner, down 11% from 2022 after COVID-era demand for work-from-home computers and new entertainment devices cooled off.

Analysts at Gartner forecast global spending for InfoTech (including hardware, software, services and telecommunications) at $4.6 trillion for 2023.
The InfoTech industry is a truly globalized sector.  Asia has grown to be one of the top spots worldwide for IT expenditures, research and manufacturing.  Computer technology research, the development and manufacture of components and the assembly of completed systems have grown quickly in the labs and manufacturing plants of China, Taiwan, South Korea, Singapore and Japan, among other nations.  Computer services continue to move offshore quickly, particularly to the tech centers of India.  Asian technology brands are very powerful, including Asus, Samsung and Lenovo.  Meanwhile, the leading U.S. brands, including Apple and Dell, have most of their equipment manufactured in state-of-the-art factories in Asia.
The 1970s and 1980s were often called the “Information Age,” and the 1990s can be singled out in history as the beginning of the “Internet Age.”  The first few years of the 21st century might be called the “Broadband Age” or, better yet, the “Convergence Age,” as entertainment, news, telephony, data and video converged onto internet-connected devices.  By 2010, however, the world had clearly entered the “Mobile Age,” with the growing capabilities and appeal of smartphones.
That trend continues today, but the age of the near future will be the “Connected Devices Age” or perhaps the “Pervasive Computing Age,” as the Internet of Things (IoT) comes into being.  That is, computing devices will surround us at all times in all places, largely interconnected and communicating with both the user and the environment around the user.  A combination of Wi-Fi, cloud computing, remote wireless sensors, 5G cellular networks and incredibly advanced mobile devices is accelerating this trend.  Wearable sensors and computers are an obvious, early manifestation of this trend, with personal fitness monitors and smartwatches being pioneers in this regard.  Perhaps more important is the recent boom in voice-activated digital assistants such as Google Home and Amazon Echo, connected to Wi-Fi networks, which provide instant access to entertainment, search and other features.
Vastly more smartphones are sold each year than PCs on a worldwide basis.  When you add in platforms such as tablets and notebooks (and even digital entertainment systems in vehicles) connected wirelessly to the internet, the shift to mobile computing is even more pronounced.  It isn’t that the PC is dead, but the PC is being relegated to a lessened status, far outshone by mobile devices.
Approximately 1.43 billion smartphones were sold in 2022, according to Gartner.  While Apple's iPhone remains extremely popular, Google's Android mobile operating system holds a dominant market share, not only because Google distributes the Android operating software free of charge, but also because it is a fine piece of technology.
Today, broadband sources such as fiber-to-the-premises (FTTP) provide very high-speed access to information and media, creating an “always-on” environment for computer users at home and in the office.  Mobile computing is accelerating this environment of constant access to data and communications.  
Broadband access has been installed in enough U.S. households and businesses (more than 126.8 million fixed home and business subscriptions by December 2023, according to Plunkett Research estimates, plus 390.3 million wireless internet connections) to create a vast mass market, fueling demand for new internet-delivered services, information and entertainment.  Growth in broadband subscriptions worldwide is very strong.  There were more than 5.4 billion internet users worldwide (including wireless) by 2023, according to the International Telecommunication Union (ITU).  Continuous technological progress is moving in step with this rapidly expanding user base, leading to a steady evolution in the way we access and utilize software applications, including the soaring growth of cloud computing.  Over the next few years, groundbreaking products will be introduced in areas such as energy-efficient chips, artificial intelligence, optical switches and networking technologies, and advances will continue to be made in quantum computing.  The biggest single challenge may be the vital need to make systems more secure from hacking, phishing and account takeover—from the largest enterprise-level systems to the devices and accounts of individual users.
InfoTech continues to enable new efficiencies.  M2M (machine-to-machine) communications via remote wireless sensors, internet-connected appliances and digitally-controlled machinery and equipment will eventually mean that the world’s industrial activity, transportation, supply chain, environmental controls and infrastructure will be interconnected digitally.  (These activities are generally referred to as “IoT,” or “the internet of things.”)  One of the biggest opportunities facing the IT industry for the mid-term is the harvesting and analysis of big data from this increasingly complex network of machines and sensors.
The health care industry is undergoing a technology revolution of its own.  Patient records are finally digital, and RFID is making hospital inventories more manageable.
For businesses, the stark realities of global competition are fueling investments in computing systems and software.  Demands from customers for better service, lower prices, higher quality and more depth of inventory are mercilessly pushing companies to achieve efficient re-stocking, higher productivity and faster, more thorough information management.  These demands will continue to intensify, partly because of globalization.
Businesses are paving the paths to their futures with vast sums invested in computing because:  1) substantial productivity gains are still possible, particularly in the era of advances in artificial intelligence; 2) the relative cost of the technology itself has plummeted while its power has multiplied; and 3) competitive pressures leave them no choice.

A Brief History of PC Milestones
=    1971: Intel introduces the first microprocessor, the 4004.
=    1976: Popular hardware enters the market, but software is lacking. Apple introduces the Apple II personal computer. Commodore introduces the PET. Radio Shack enters the market with the TRS-80.
=    1981: IBM finally enters the market with the IBM PC, based on a Microsoft operating system called DOS.
=    1982: Clones compatible with the IBM PC enter the market. Since IBM did not acquire exclusive rights to MS-DOS, clones are able to compete effectively with their own DOS-based PCs.
=    1984: A cult is born when Apple introduces the Macintosh. Dell Computer is launched by a college student in Austin, Texas.
=    1990: Microsoft introduces a leap forward with Windows 3.0.
=    1993: Mosaic is born, the first graphics-based web browser. The internet is ready to surge.  Intel’s Pentium processor is launched, with 3.1 million transistors.
=    1994: Online directory giant Yahoo! is launched by two Stanford University students. Version 1.0 of the open-source Linux kernel is released.
=    1995: Amazon.com is launched. Netscape, maker of the first widely used internet browser, sells its stock to the public.
=    1996: eBay is launched. Microsoft introduces the Internet Explorer browser.
=    2003: Much faster 64-bit chips are put on the market.  Wi-Fi and other wireless technologies advance and proliferate.
=    2004: Open systems, such as Linux and Mozilla, move ahead broadly, gaining widespread acceptance over a wide variety of platforms.
=    2007: Apple launches the iPhone, making the smartphone the personal computing device of choice for many users. 
=    2010: Apple launches the iPad tablet computer.
=    2011: The cloud gains wide acceptance as an efficient place for data storage and collaboration.
=    2012: The era of The Internet of Things (IoT) begins as machine-to-machine communication gains wide interest.
=    2014: Amazon’s Echo, a wireless, digital personal assistant, is introduced, ushering in a new era of voice-activated user interface to search and entertainment.
=    2015:  China is an undeniable giant in the technology industry, as a contract electronics manufacturer, and as the home of rapidly growing firms such as Alibaba (ecommerce) and Xiaomi (smartphones).
=    2019: A 128-qubit quantum system becomes available in late 2019 through Amazon Braket, a fully managed AWS service.
=    2020:  The flexibility of cloud-based systems enables work-from-home on a massive scale during the Coronavirus pandemic.  Ecommerce soars, but online fraud and account takeover also soar.  IBM announces in early 2020 that it has more than 12,000 users per month for its IBM Q Network, consisting of 15 publicly available quantum computers in sizes ranging from five to 53 qubits.
=    2021-2024:  Generative AI system ChatGPT is launched by OpenAI on November 30, 2022.  The world’s largest semiconductor manufacturers announce hundreds of billions of dollars in investments in new chip manufacturing plants over the mid-term.  Artificial intelligence (AI) continues to grow rapidly across hundreds of categories and industries.  Microsoft makes massive investments in AI leader OpenAI.
Source: Plunkett Research, Ltd.
