9 Programming Geniuses Who Changed the World

Programming has changed the world for the better. It has made our lives easier by solving problems we never thought could be solved. Imagine being in 16th-century Europe and wanting to connect with someone in Asia without traveling an inch. People would have declared you mad. It would have been impossible.

But here we are in the modern world where almost everything seems possible. Have you scrolled through Instagram to see what’s happening around the world? Have you used a video conferencing app to speak with your friend recently? Or did you watch a good Netflix series over the weekend?

Every technology we use today relies on programming and its principles.

Programming is the process of creating a set of instructions that tell a computer how to perform a task. With the advent of programming, it has become easy to scale things by letting computers do the routine work, freeing the human brain for bigger problems.
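To make that definition concrete, here is a minimal illustration (in Python, purely as an example; the task and values are hypothetical):

```python
# A program is just a list of instructions the computer follows in order.
# Here the repetitive work being automated is totaling a list of prices.
prices = [19.99, 4.50, 7.25]

total = 0.0
for price in prices:      # instruction: repeat for every item
    total += price        # instruction: add the item's price to the total

print(f"Total: {total:.2f}")  # → Total: 31.74
```

A human could add three prices by hand; the point is that the same three instructions scale unchanged to three million prices.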

Not so long ago, programming was an unnamed field. But, over the past century, we have seen many mathematicians, engineers, and programmers build the theoretical foundations and the technical framework for modern-day technology.

So without further ado, let’s look at the top programmers who created the playing field on which modern-day developers play. These 9 names have had the biggest impact on how the world has been shaped and what the future holds for us –

Alan Turing (1912 – 1954)

“One day ladies will take their computers for walks in the park and tell each other, ‘My little computer said such a funny thing this morning.’”

Alan Turing

Let’s start the list with the father of modern computing – Alan Turing. This British mathematician and computer scientist devised a theoretical machine that can simulate any computer algorithm (the Turing machine) and wrote about the possibility of computer intelligence.

Turing was admitted to the University of Cambridge in 1931, where he studied mathematics. His undergraduate work on probability theory earned him a fellowship at King’s College. In 1936, he famously attacked the Entscheidungsproblem (the “decision problem”) and, in doing so, invented the model on which modern computers are based.
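The Turing machine is an abstract device that reads and writes symbols on a tape according to a fixed table of rules. The sketch below (in Python; the machine and its rule table are illustrative, not from Turing’s paper) shows the idea with a machine that inverts a binary string:

```python
# A tiny Turing machine that flips every bit on its tape.
# Rule table: (state, symbol) -> (symbol to write, head move, next state)
RULES = {
    ("invert", "0"): ("1", 1, "invert"),
    ("invert", "1"): ("0", 1, "invert"),
    ("invert", "_"): ("_", 0, "halt"),   # blank cell: stop
}

def run(tape):
    tape = list(tape) + ["_"]            # "_" marks the blank end of the tape
    head, state = 0, "invert"
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write               # write a symbol
        head += move                     # move the head
    return "".join(tape).rstrip("_")

print(run("1011"))  # → 0100
```

Despite this simplicity – a tape, a head, and a rule table – the model is powerful enough to express any algorithm a modern computer can run.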

During the Second World War, Turing led the effort to break the Enigma cipher machines that the German military used to communicate in code.
Together with his colleagues, he developed a machine called the Bombe that could work through the roughly 159 billion billion possible Enigma settings. This was a huge turning point in the war for the Allied forces and helped them win crucial battles against the Nazis.

Despite this pathbreaking work, Turing received little recognition during his lifetime; in 1952 he was prosecuted for homosexuality, then a crime in Britain, and he died two years later.

Today, his name lives on through the Turing Award – often described as the Nobel Prize of computing.

Dennis Ritchie (1941-2011)

“C is quirky, flawed, and an enormous success”

Dennis Ritchie

An American Computer Scientist, Dennis Ritchie developed one of the most widely used programming languages – the C language. Born in Bronxville, New York, in 1941, he graduated in physics and applied mathematics from Harvard University.

While working at Bell Laboratories, he met Ken Thompson, and together they built a multi-tasking, multi-user operating system, which we know as UNIX. They built it as an alternative to the batch processing systems that ran on only one type of hardware.

Dennis Ritchie’s contributions to programming and computing spanned four decades, casting an unforgettable impact across the globe.

The C language is still widely used today to write system software. And UNIX and UNIX-like operating systems run everything from servers to smartphones.

Ritchie died in 2011, but his ideas live on as an inspiration for modern operating-system design and for almost every new programming language out there.

James Gosling (1955-present)

“Java is C++ without the guns, clubs, and knives”

James Gosling

Born in May 1955 near Calgary, Canada, James A. Gosling is best known as the father of Java – one of the world’s most widely used programming languages.

Gosling received a B.Sc in Computer Science in 1977 from the University of Calgary. Later, he went on to pursue a Ph.D. from Carnegie Mellon University where he wrote the thesis titled “The Algebraic Manipulation of Constraints”.

During his Ph.D., Gosling built the multiprocessor version of UNIX along with several compilers and mail systems.

Gosling then joined Sun Microsystems in 1984 and worked there for 26 years before Oracle acquired the firm.

In 1991, Gosling and some of his colleagues started developing a new language which, unlike C++, would be platform-independent and could be used to program other devices as well.

After 18 months, he had the first working version of the language, called Oak; it was renamed Java in 1995.

This invention changed the software world forever with its great features and simplicity and continues to be the language of the present and the future.

We use Java to build applications and platforms for a wide range of devices, including computers, gaming consoles, Blu-ray players, medical monitoring devices, car navigation systems, and smartphones, to name a few.

As the Java homepage says – “More than 1 billion computers and 3 billion mobile phones worldwide run on Java.”

For these achievements, Gosling was elected to the United States National Academy of Engineering.

He has since worked at both Google and Amazon, and spent several years at the ocean-robotics startup Liquid Robotics.

Guido van Rossum (1956 – present)

“Modern Programs must handle Unicode – Python has excellent support for Unicode, and will keep getting better”

Guido van Rossum

If anyone has eased the process of turning an idea into working code, it’s Guido van Rossum, with his creation of the Python programming language.

Van Rossum was born and brought up in the Netherlands, and he completed his Master’s in mathematics and computer science from the University of Amsterdam in 1982.

In 1986, he helped develop the ABC programming language, gaining a great deal of experience from its co-developers.

Interestingly, he took up a hobby programming project in December 1989, to keep himself occupied during the Christmas holiday week, and created a new scripting language, naming it Python.

Today, Python is the second most popular language on GitHub and one of the most frequently mentioned languages in job postings, for the following reasons –

  • Easy and intuitive language
  • Open source, meaning anyone can contribute to its development
  • Code as understandable as plain English
  • Suitable for everyday tasks
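The readability claim is easy to see in practice. This short, illustrative snippet reads almost like the English sentence describing it:

```python
# "Keep every even number from the list, then sort the result."
numbers = [7, 2, 9, 4, 6]
evens = sorted(n for n in numbers if n % 2 == 0)
print(evens)  # → [2, 4, 6]
```

One line of Python expresses the filter and the sort, with no boilerplate standing between the idea and the code.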

Van Rossum later worked at Google and at Dropbox, and has reportedly been working in Microsoft’s developer division since 2020.

Brendan Eich (1961-present)

“If the web can be evolved to include the missing APIs and have better performance, [developers] won’t need to go beyond the web.”

Brendan Eich

This American technologist is the father of JavaScript – the most widely used programming language on the web today.

He received his bachelor’s degree in mathematics and computer science from Santa Clara University and went on to complete his Master’s in 1985 from the University of Illinois.

While working at Netscape Communications Corp., Eich was asked to develop a new language that resembled Java for the Netscape web browser. He completed the first version in just 10 days; it was originally called Mocha, briefly LiveScript, and finally JavaScript.

Brendan Eich also co-founded the Mozilla Project, the Mozilla Foundation, and the Mozilla Corporation.

At present, Eich is the co-founder and CEO of Brave Software, an internet browser platform company.

Tim Berners-Lee (1955-present)

“You affect the world by what you browse”

Tim Berners-Lee

Can you imagine a world without “www.”?

We can’t either.

Tim Berners-Lee invented the World Wide Web.

Born in 1955, in London, Sir Tim had a natural inclination towards computers in his early life as both his parents worked on one of the earliest computers – Ferranti Mark 1.

He graduated from Oxford University and started working as a software engineer at CERN, the physics laboratory near Geneva, Switzerland.

While there, he noticed that scientists at CERN had difficulty sharing information, as they had to log on to different computers to retrieve it.

Sir Tim set out to solve this problem. He realized that millions of computers could share information using an emerging technology called hypertext.

In March 1989, he laid out his vision for the ‘web’ in a document titled “Information Management: A Proposal”. Though experts didn’t immediately accept his project, his boss Mike Sendall gave him permission to continue work on it.

A year and a half later (October 1990), Sir Tim came up with three fundamental technologies that make up the web of today – HTML, URI, and HTTP.

Besides developing the World Wide Web, his greatest contribution was making the web royalty-free. Today, all of us can access the web without paying a single penny, and the credit goes to Sir Tim Berners-Lee.

For his revolutionary work, he has received, among many other accolades, the Royal Medal in 2000, a knighthood (KBE) in 2004, and the Turing Award in 2017.

Joseph Carl Robnett Licklider (1915-1990)

[The computer is also the direct descendant of the telegraph as it enables one…to] “transmit information without transporting material”

J.C.R Licklider

Today, the biggest companies such as Google, Amazon, and Microsoft have adopted Cloud Computing. But who proposed the Cloud?

It was Dr. Joseph Carl Robnett Licklider who formulated the earliest ideas of an ‘Intergalactic Computer Network’ in 1962. Today, innumerable software systems use the Cloud for efficient storage, eliminating the need for local data centers.

Licklider studied psychology, mathematics, and physics at Washington University in St. Louis, receiving his bachelor’s degree in 1937 and a master’s degree in psychology in 1938, followed by a doctorate from the University of Rochester.

He was far ahead of his time, and his ideas live on in many of the features we use today, including graphical computing, user-friendly interfaces, e-commerce, and online banking.

John McCarthy (1927-2011)

“He who refuses to do the arithmetic is doomed to talk nonsense”

John McCarthy

Artificial Intelligence is the buzzword of today. We constantly hear talk of a future with robots, the metaverse, and more, and numerous products and services have already incorporated AI into their systems.

But do you know who formulated the idea in the first place? It was none other than John McCarthy, who coined the term in 1955. His main research involved the formalization of common-sense knowledge.

In 1958, McCarthy created the programming language LISP, which became the favored language of the AI community.

McCarthy completed a bachelor’s degree in mathematics (1948) from the California Institute of Technology and a doctorate from Princeton University later in 1951.

Besides founding the Stanford Artificial Intelligence Laboratory (SAIL), he also pioneered early ideas on computer time-sharing. McCarthy received the ACM A.M. Turing Award in 1971 for his contributions to the field, followed by the Kyoto Prize (1988) and the National Medal of Science (1990).

Bill Gates (1955-present)

“Software is a great combination between artistry and engineering”

Bill Gates

Perhaps the most recognizable name on this list, Bill Gates needs no introduction. This American computer programmer and entrepreneur co-founded Microsoft Corporation – the world’s largest personal-computer software company.

Gates wrote his first software program when he was just 13 years old, and he has never looked back. In 1975, Gates and his childhood friend Paul Allen wrote a BASIC interpreter for one of the first microcomputers. When the project succeeded, Gates dropped out of Harvard during his junior year and founded Microsoft with Allen.

The microcomputer industry was young at the time, and Gates built an early lead by licensing the MS-DOS operating system to IBM – then the world’s biggest supplier of computers. By 1990, at the age of 35, Bill Gates had become the dominant figure of the PC industry.

Besides his programming acumen, Gates actively participates in philanthropy work and is a major influencer in today’s world.

Summary

Currently, we are in the midst of the fourth industrial revolution, and the software industry is influencing more and more areas of our lives.

The famous programmers mentioned above kickstarted the industry in the 20th century but it’s fair to say that it’s just the beginning.

From search engines and websites, we are steadily moving towards Artificial Intelligence, Web3, and the Metaverse. And as they always have been, programmers are going to be at the forefront of these developments as well.

Do you want to help shape the future with your ideas and creations? Do you want to join the wave of technological advancement that is making life easier?

If yes, start now by taking our computing-related courses for Full-stack web development and Data analytics and enter into the world of software with a bang.

Learn the fundamentals of Data structures and Algorithms to kickstart your programming journey.

In the words of famous software engineer Kent Beck –

“Make it work, make it right, make it fast”

Take fast action!