2017 Programming PSP - Gopal Othayoth
Even though computers (hardware) and software have only been around for a very short time, their impact on modern society and individuals has been tremendous - as important as electricity, the printing press, or the telephone. In just a few decades, programs have provided a vast array of inventions and capabilities, both good and bad, that permeate every part of our daily lives. Without software, the modern world would grind to a halt almost instantly. Most businesses would come to a virtual standstill without programs for accounting, analytics, finance, human resources, order tracking, planning, and so on. Similar examples are pervasive in other fields such as government, the military, and medicine - they all increasingly rely on computers and software to automate not only mundane tasks but also complex, artificial-intelligence-driven decision-making systems. Of course, not all aspects of programs are beneficial. There is a whole new world of software-based threats and countermeasures that society and individuals encounter every day. Computer viruses and malware target everything from individual personal computers and phones to large government security systems. Good or bad, the impact of software is undeniable in today’s world (Jones). To really understand programming, one must look at what computers are, how they work using logic circuits, how they execute programs, and how programming languages help humans write these programs.
A modern computer is generally described as an electronic machine (although older computers were mechanical). It has the capability to accept data (unprocessed facts, figures, symbols, etc.) and instructions (which tell the computer what to do) and to produce information (data that is organized, meaningful, and useful) as output. In short, computers process data to create information. Modern computers use hardware and software. Hardware is the electric, electronic, and mechanical equipment that makes up a computer. Software (or programs) is the series of instructions that tells the hardware how to perform tasks. Computer hardware typically includes a system unit, input and output devices, memory for storing data and instructions, and devices for storage and communications. Software provides the set of instructions for the hardware to execute (Lemley, Chapter 1). Ultimately, at their core, computers use fundamental mathematical principles and electronic circuits to do their work.
Internally, computers are essentially very advanced calculating machines whose logic circuits perform numeric calculations. They do not use the number system that people normally use, the decimal system. Unlike the decimal system, which has a base of ten and ten digits (0 through 9), computers use a system called binary, which has a base of two and only two digits (0 and 1). The reason for this is that 0 and 1 map very naturally onto the OFF/ON states of a simple switch. Computers use millions or even billions of these switches to perform their operations. Most of the logic in computers is performed by specialized electronic circuits called gates. There are different kinds of gates, each performing a specific operation. For example, a type of gate called an AND gate produces an output of 1 (ON) only when both of its inputs are 1 (ON). Another type of gate called an OR gate produces an output of 1 (ON) when either of its inputs is 1 (ON) (Petzold, 102-130). Using a combination of these gates, computers can perform basic addition. Everything else - all the complex programs you see - builds on this basic function: subtraction is derived from adding negative numbers, multiplication from repeated addition, division from repeated subtraction, and so on (Petzold, 132-154).
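The gate behavior described above can be sketched in a few lines of Python (my own illustration, not taken from the cited sources): each gate maps two ON/OFF inputs to one output, and combining gates yields a one-bit "half adder," the building block of binary addition.

```python
# Simulating the logic gates described above. Each gate maps two
# ON/OFF inputs (written 1/0) to one output, like a hardware switch.

def AND(a, b):
    return 1 if a == 1 and b == 1 else 0   # ON only when both inputs are ON

def OR(a, b):
    return 1 if a == 1 or b == 1 else 0    # ON when either input is ON

def XOR(a, b):
    # ON when exactly one input is ON, built only from AND and OR
    return OR(AND(a, 1 - b), AND(1 - a, b))

def half_adder(a, b):
    """Add two single binary digits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# In binary, 1 + 1 = 10: sum bit 0 with a carry of 1.
print(half_adder(1, 1))  # -> (0, 1)
```

Chaining such adders bit by bit (feeding each carry into the next stage) is how hardware adds numbers of any size.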
Using these logic circuits, computers execute sets of instructions called programs. Since, at bottom, computers mostly understand only how to add things together, anything you want a computer to do must be laid out in those terms. Computers use a language called machine language to specify where things like operators and operands are and what to do with them. For example, a computer might be given machine language instructions to load two numbers (say 2 and 3) into some location in memory (usually called register memory) and then perform an operation on them, such as addition (Petzold, 132-154). This is extremely hard for humans to do, since even the simplest tasks require thousands or even millions of instructions. To solve this problem, scientists invented programming languages that allow humans to write things out in a simpler, more understandable form and then use the computer itself to translate that form into the thousands or millions of instructions the computer can understand. These languages are called high-level languages. There are two main kinds of programs that perform this translation: compilers and interpreters. Compilers take the high-level language and perform a one-time translation into machine instructions. The resulting program can then be used without the original source, or even the compiler, being available. Interpreters, on the other hand, translate the original language each time (on the fly) and perform those operations. The advantage of interpreters is that it is very easy to change something in the original program and see the effects quickly, but the downside is that the original source and the interpreter need to be available each time (Petzold, 349-363).
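The "load a number, then operate on it" style of execution described above can be mimicked with a toy sketch in Python (the instruction names LOAD and ADD are invented for illustration and do not belong to any real instruction set): the machine steps through instructions one at a time, keeping its working value in a register.

```python
# A toy sketch of machine-style execution. Instruction names are
# invented for illustration; real instruction sets differ.

def run(program):
    registers = {"A": 0}          # one register, like the text's "register memory"
    for op, *args in program:
        if op == "LOAD":          # LOAD value: put the value into register A
            registers["A"] = args[0]
        elif op == "ADD":         # ADD value: add the value to register A
            registers["A"] += args[0]
    return registers["A"]

# "Load 2, then add 3" - the example from the paragraph above.
print(run([("LOAD", 2), ("ADD", 3)]))  # -> 5
```

A compiler would translate a statement like `2 + 3` into such an instruction list once; an interpreter would walk the statement and carry out the operations on the fly each run.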
Machine languages are hard for humans to read. For example, a fragment of a program written in assembly language (a human-readable notation for machine instructions) may look something like this:
main proc
    mov eax,5    ; load the value 5 into the eax register
    add eax,6    ; add 6 to it, leaving 11 in eax
This is about as low as programmers normally go (a notation known as assembly language) without writing out raw ones and zeros. Although it is relatively easy for a machine to translate and execute this sequence of instructions, it is still hard for humans to read. This can cause several problems - for example, debugging such code can become extremely difficult. Such programs can get incredibly long and complex, so people invented high-level languages to make their jobs easier (Sethi, 5-7). Using a higher-level language such as C or JavaScript has many benefits, such as readability (which, among other things, makes debugging easier), the ability to check for errors during implementation, machine independence, and the availability of program libraries (e.g. jQuery for JavaScript). For example, the Unix operating system, which was originally written in assembly language, was rewritten in C in 1973. This led to more users and programs, because users did not want to learn assembly language and C was a higher-level alternative. Other high-level languages include Fortran, Pascal, C++, C# .NET, Visual Basic .NET, Smalltalk, LISP, and so on (Sethi, 7-8).
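The readability benefit is easy to see side by side. The three assembly lines shown earlier - load 5, add 6 - collapse into a single self-explanatory statement in a high-level language (Python is used here purely as an example of such a language):

```python
# The same computation as the assembly fragment above:
# load 5 into a register, add 6. In a high-level language it is one
# readable line; a compiler or interpreter produces the machine
# instructions for us.
result = 5 + 6
print(result)  # -> 11
```

The programmer never needs to know which register holds the value or which machine instruction performs the addition, which is exactly the machine independence the paragraph describes.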
In general, there are a few different types of programming languages, among them imperative, functional, object-oriented, and logic languages. Imperative languages are action-oriented, which means that a computation is viewed as a sequence of actions. Pascal and C, for example, are imperative languages. They are generally procedural in nature and rely on specific sequences of well-defined steps (Sethi, 11). The basic concepts of functional languages, on the other hand, originated with LISP, a language designed in 1958 for artificial intelligence (AI). LISP was designed primarily for symbolic data processing and has been used for symbolic calculations in differential and integral calculus, electrical circuit theory, mathematical logic, game playing, and other fields of AI. Functional languages tend to encapsulate things as functions that can be evaluated (Sethi, 14). A third type of programming, object-oriented programming, evolved from concepts used in computer simulations. Examples of object-oriented languages are Simula, C++, and Smalltalk. Object-oriented languages treat programming concepts as objects that have attributes and behavior - for example, all quadrupeds have four legs (a property or attribute) and can walk (a method or behavior) (Sethi, 15). Last but not least, there is logic programming, which uses logical reasoning to deduce answers to problems. The approach began with the invention of the Prolog language in 1972, which was initially used for natural language processing in French and for human-machine interaction. Prolog supports reasoning over statements like these:
Every dog is an animal
Every dog barks
Sam barks
Is Sam a dog?
(Sethi, 16).
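The statements above can be encoded as rules and facts and queried mechanically. The sketch below is my own toy Python code, not real Prolog, but it mimics the backward-chaining idea: a rule says its head holds if everything in its body holds, and a fact is a rule with an empty body. Note that strict deduction cannot conclude that Sam is a dog just because Sam barks (the rule runs from "dog" to "barks," not the reverse), and a real Prolog system would likewise answer "no" to that query.

```python
# A toy sketch of rule-based deduction (invented code, not real Prolog).
# Each rule is (head, body): the head holds if all body goals hold.
# "X" acts as a variable that matches any argument.

RULES = [
    (("animal", "X"), [("dog", "X")]),   # every dog is an animal
    (("barks",  "X"), [("dog", "X")]),   # every dog barks
    (("barks", "sam"), []),              # fact: Sam barks
]

def holds(goal):
    """Very small backward-chaining prover over RULES."""
    pred, arg = goal
    for (head_pred, head_arg), body in RULES:
        if head_pred != pred:
            continue
        if head_arg == "X":
            # Variable head: bind X to the queried argument in the body.
            subgoals = [(p, arg if a == "X" else a) for p, a in body]
        elif head_arg == arg:
            subgoals = body
        else:
            continue
        if all(holds(g) for g in subgoals):
            return True
    return False

print(holds(("barks", "sam")))  # -> True  (a stated fact)
print(holds(("dog", "sam")))    # -> False (barking alone does not prove dog-hood)
```

The programmer states what is true and asks questions; the language's inference engine, not a hand-written sequence of steps, does the reasoning.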
Given that programming has made so many advances in such an incredibly short period of time, the future is boundless. Programs are evolving to the point where people can accomplish simple tasks just by speaking or writing them down in English or another human language - natural language processing programs are intelligent enough to pick up this information and turn it into machine language. Speech recognition and machine translation of human languages have become so commonplace that people now use them on their phones daily. Other growing areas of software include biometrics (for example, facial recognition and behavior recognition), machine learning (systems capable of analyzing and understanding large amounts of data through deep learning), artificial intelligence, and expert systems. Software now controls airline flights so thoroughly, from takeoff to touchdown, that pilots are mainly needed for emergencies and as a standby. Self-driving cars powered by software are already on the roads and promise a driving experience as safe as, or even safer than, a human driver’s. There are no limits to what software programming can do, since it is powered by imagination and engineering.
Works Cited
“Chapter 1 - Introduction to Computers.” University of West Florida, uwf.edu/clemley/cgs1570w/notes/Concepts-1.htm. Accessed 12 Apr. 2017.
Horn, Delton T. Basic Electronics Theory: with Projects & Experiments. Blue Ridge Summit, Pa, Tab Books, 1981.
Jones, Capers. “The Impact of Software on People and Society.” InformIT, 23 Dec. 2013, www.informit.com/articles/article.aspx?p=2163344&seqNum=5. Accessed 13 Apr. 2017.
Lemley, Linda. “Computer Concepts and Applications.” Notes, Pensacola Junior College, 18 Aug. 2005, uwf.edu/clemley/cgs1570w/notes/index.htm. Accessed 13 Apr. 2017.
McConnell, Steve. Code Complete. Redmond, WA, Microsoft Press, 2016.
Niederst, Jennifer. Learning Web Design: a Beginner's Guide to HTML, Graphics, and Beyond. Beijing, O'Reilly, 2001.
Petzold, Charles. Code: the Hidden Language of Computer Hardware and Software. Redmond, WA, Microsoft Press, a Division of Microsoft Corporation, 2000.
Sethi, Ravi. Programming Languages: Concepts and Constructs. Reading, MA, Addison-Wesley, 1997.