
Everyday IP: When were computers invented?

Given that the subject of one of our recent articles in the Everyday IP series was cars, you might think we could not possibly pick an invention more critical to people's daily lives… but computers just might fit that description. After all, you are using one at this very moment to read this article, be it your home or work laptop or the smartphone in your hand — with more processing power than all but the most recent desktop computers.

Computers (and computing) are not only crucial to many individuals' personal and professional lives, but they also play a pivotal role in the creation, proliferation and preservation of IP. With that in mind, let us take a brief journey through the history of these machines while also examining how they affect the IP industry.

The first computers: From Greek contraptions and Eastern abaci to Charles Babbage

Ancient Greece can take credit for what may reasonably be considered the earliest ancestor of the computer: a device known as the Antikythera mechanism. This hand-powered mechanical invention, dating back perhaps as far as 200 BCE, consisted of a small wooden box (approximately 34 x 18 x 9 centimeters) containing at least 30 interlocking bronze gears and is believed to have been used for early astronomical calculations, such as predicting the positions of celestial bodies and eclipses.

More practical, however, was the abacus. Many ancient civilizations created these counting machines, including Sumer, Babylonia, Persia, Greece, Rome and China. In its earliest Mesopotamian form, the abacus was simply a board on which figures could be traced in sand. Later devices employed thin metal rods set in parallel rows within a simple rectangular frame, strung with beads, marbles or other similar objects used as counters. Sliding these counters back and forth allowed users to carry out all four arithmetic operations (addition, subtraction, multiplication and division) and even work through basic square-root calculations.

Today, abaci can still be found in the early-grade classrooms of primary schools around the world, and they remain in everyday use in parts of the Middle East, China and Japan.

Jumping ahead more than 1,500 years brings us to the origin of the word "computer." When it was first coined in 1613, the term described a human being with the mathematical skills to perform complex calculations. Nearly two centuries later, in 1801, textile merchant Joseph Marie Jacquard devised and built a loom that used wooden punched cards to direct some of the machine's functions, prefiguring elements of computers that would arrive more than a century later.

An automatic calculating machine capable of performing complex computations, the Analytical Engine was designed by Charles Babbage as an improvement on his original Difference Engine. (Image source: Mrjohncummings/Wikimedia Commons/CC BY-SA 2.0)

A far bigger development was English mathematician Charles Babbage's Difference Engine. In 1822, the computing pioneer conceived of a steam-powered calculating machine that could compute and tabulate the vast tables of numbers then compiled by hand. He never managed to complete a fully functional model, but he improved upon the design with his Analytical Engine in 1834. Though only partially built before his death, the programmable machine was designed to read instructions from punched cards and to support conditional branching, looping and other features, including rudimentary forms of microprogramming and parallel processing, that are common in modern computers.
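To make those ideas a little more concrete, here is a minimal sketch in Python (an illustration added for this article, not anything Babbage wrote) of two of the features his designs anticipated, looping and conditional branching, applied to the kind of difference table the Difference Engine was built to tabulate.

```python
# Minimal sketch, for illustration only: a loop and a conditional branch used to
# build a table of finite differences, the calculation the Difference Engine
# was designed to mechanize.

def difference_table(values):
    """Return successive rows of finite differences, stopping once a row is constant."""
    rows = [values]
    while len(rows[-1]) > 1:                          # looping
        current = rows[-1]
        diffs = [b - a for a, b in zip(current, current[1:])]
        rows.append(diffs)
        if len(set(diffs)) == 1:                      # conditional branching
            break                                     # the differences are constant
    return rows

squares = [n * n for n in range(1, 8)]                # 1, 4, 9, 16, 25, 36, 49
for row in difference_table(squares):
    print(row)
# The second differences are a constant 2, the property the Difference Engine
# exploited to build tables using nothing but repeated addition.
```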

Reinventing the wheel: Early supercomputers

Though unrealized in his lifetime, Babbage's visions presaged the room-sized computers of the 1940s, such as the Harvard Mark I and the Electronic Numerical Integrator and Computer (ENIAC). Unfortunately, much of the English pioneer's work had been obscured in the decades between his death in 1871 and the 1940s.

For example, ENIAC co-creator John Mauchly did not know about Babbage's use of punched cards or the other ways in which his own invention owed the Englishman a great debt. As best we can tell, Babbage made no attempt to patent either the Difference or Analytical Engine, which likely did not help keep his name alive or his innovations at the forefront of research. Ultimately, the creators of both ENIAC and the Mark I essentially retraced Babbage's steps in various ways without realizing it.

But this era had plenty of original watershed moments of its own, not least of which was Alan Turing's notion of a "universal computing machine." In a 1936 paper, Turing posited that a machine whose instructions were stored within it (at that time, as symbols on a tape) could carry out virtually any mathematical or scientific calculation. His research would be integral to the development of the personal computer and to the earliest theories of artificial intelligence. Sadly, Turing was unable to capitalize fully on his genius, not because of any failure to protect his IP, but due to societal prejudice against his sexual orientation. This persecution culminated in an appalling 1952 arrest and conviction that drove him to suicide two years later.
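Turing's insight is easier to see with a toy example. The sketch below (in Python, with invented names, and nothing like Turing's own notation) treats the machine's program as ordinary data, a table of transitions read alongside the tape, which is the stored-instruction idea at the heart of his 1936 paper.

```python
# Toy Turing-machine sketch: the "program" is just data (a transition table)
# that the machine consults at every step. This example machine appends a
# single '1' to a block of '1's on the tape.

def run(program, tape, state="start", blank="_", max_steps=100):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Transition table: (state, symbol read) -> (symbol to write, head move, next state)
program = {
    ("start", "1"): ("1", "R", "start"),   # skip over the existing 1s
    ("start", "_"): ("1", "R", "halt"),    # write one more 1, then halt
}

print(run(program, "111"))   # -> "1111"
```

Because the table is data, swapping in a different table yields a different machine, which is the sense in which one "universal" machine can imitate any other.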

The IP wars of Silicon Valley (and beyond)

In contrast to the obscurity and persecution endured by their predecessors, some of the most prominent figures associated with modern computing landmarks are household names today. Think of Microsoft's Bill Gates and Apple's Steve Jobs and, to a lesser extent, Apple co-founder Steve Wozniak. Others are less well known but have noteworthy achievements to their names: Alan Shugart's 1971 spearheading of the floppy disk, Bjarne Stroustrup's invention of the C++ programming language in 1980 and Tim Berners-Lee's 1990 creation of HTML (and the world's first web browser).

The Macintosh 128K, released by Apple Computer, Inc., said "Hello" to the world on January 24, 1984. It included an Easter egg in the OS ROM designed to prevent unauthorized cloning of the Macintosh.

New developments were surfacing left and right in the world of computing, at a pace that grew ever more frantic from the 1980s onwards. The pressure to develop the next big thing (and, arguably, a streak of ruthlessness within the industry) has at times driven everyone from software developers and hardware engineers to product designers to "borrow" others' work, sparking numerous, often lengthy, legal battles.

Tech IP cases have reached the highest courts in the United States. The most recent example is Google v. Oracle, in which the latter accused the former of copying application programming interfaces (APIs) from its Java platform. (Because it was Google that petitioned the Supreme Court to hear the dispute, the case is no longer captioned Oracle v. Google.) Google ultimately prevailed in April 2021, roughly a decade after the litigation began, when the Supreme Court ruled 6-2 that its copying of the Java API's declaring code was a fair use. In a similarly drawn-out case, Apple and Samsung reached a 2018 settlement in the former's favor, seven years after Apple claimed Samsung had "slavishly" copied the iPhone design for its own smartphones.
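For readers outside software, the technical crux of Google v. Oracle was the difference between an API's declaring code (the names, parameters and organization that developers rely on) and the implementing code behind it. The sketch below is a loose illustration in Python rather than Java, with an invented function name, and is not drawn from the actual Java SE API.

```python
# Loose illustration only (Python, invented name): the difference between an
# API's "declaring code" and its "implementing code."

def max_of(a: int, b: int) -> int:      # the declaration: a name and signature
    """Return the larger of two integers."""
    # Everything below is implementing code. Google wrote its own
    # implementations for Android but reused the Java API's declarations,
    # copying the Supreme Court found to be a fair use.
    return a if a >= b else b

print(max_of(3, 7))   # -> 7
```

Reusing familiar declarations let existing Java developers carry their skills over to Android, a point that weighed heavily in the Court's fair-use analysis.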

Looking at these issues more broadly, Microsoft under Gates' direction aggressively pursued supposed infringers of its IP, yet the company also has a checkered history of monopolistic and anti-competitive practices, up to and including the allegation that MS-DOS appropriated elements of the earlier CP/M operating system. Many other computing giants have faced similar accusations.

With intangible assets such as IP now accounting for the majority of many organizations' value, much of today's IP field is characterized by tech designs and patents. It seems possible that tech-related disputes will form the lion's share of IP law proceedings for the foreseeable future.

If you find yourself in need of a persistent, globally present and highly knowledgeable IP law services and management firm to protect your tech patents, look no further than Dennemeyer.
