Tech Topic Connection

Photo by Olia Danilevich: https://www.pexels.com/photo/man-sitting-in-front-of-three-computers-4974915/

Programming languages are an integral part of the fundamentals of information technology, allowing us to communicate instructions to computers and to apply computational thinking, such as algorithms, to solve problems. According to Vahid and Lysecky (2019), “computers can only represent two values (0 or 1), so base two numbers, known as binary numbers ("bi" refers to two) are used” (Section 1.5). Groups of these binary digits, or bits, form the machine code on which a hierarchy of programming languages has evolved, and those languages are used to develop software applications and operating systems, access databases, and automate tasks. Programming languages have also provided a way to help secure computing systems and networks while protecting individual and organizational privacy. Furthermore, computing in general would not exist without programming languages, which have continued to develop since the early 1800s.
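
For illustration, a minimal Python sketch of this base-two representation might look like the following; it uses only built-in functions, and the sample values are chosen arbitrarily:

```python
# Minimal sketch: how a computer represents values in base two (binary).
number = 42
print(bin(number))  # '0b101010' -- 42 written in base two

# Characters are also stored as binary numbers via an encoding such as ASCII/UTF-8.
for char in "IT":
    code_point = ord(char)            # numeric code for the character
    bits = format(code_point, "08b")  # the same value as an 8-bit pattern
    print(char, code_point, bits)     # e.g. I 73 01001001
```

Every letter, pixel, and instruction a computer handles ultimately reduces to bit patterns like these.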

Looking at the brief history of programming languages, we can better understand how they relate to information technology concepts and how computers operate. According to Paulsen (2014), “the history of programming languages has been largely concerned with recounting the creation of specific languages and the evolution of particular technical features” (p. 38). In 1804, the French silk weaver Joseph Marie Jacquard built a loom that used punched cards to weave specific patterns automatically. Later, Charles Babbage, often called the father of the computer, designed the Difference Engine, an automatic mechanical calculator built to compute polynomial functions, and the Analytical Engine, a proposed general-purpose machine with logic and memory. Ada Lovelace, who translated a paper on the Analytical Engine from French to English, published in her notes what is widely regarded as the first computer program, an algorithm that used variables. Furthermore, most of these inventions used punched cards, whose hole-or-no-hole patterns encode information much like the zeros and ones of modern computing, giving us a glimpse into the history of programming languages and their role in the evolution of information technology.

Additionally, programming languages rely on hardware, the physical parts of a computer, to function efficiently and effectively. A computer's central processing unit (CPU) executes the instructions, or code, produced from different programming languages. Random access memory (RAM) temporarily stores that code and data for the CPU to access; without it, a computer would be far less responsive, because hard drives are typically much slower than RAM and a CPU's cache. Programming languages also let us use non-volatile storage such as USB flash drives, where code writes data to and removes it from portable memory. Programs run on all types of computers, including servers, personal computers (PCs) or desktops, tablets, and smartphones. They are also an essential part of supercomputers, a class of computer that, according to Vahid and Lysecky (2019), “is used for computationally intensive jobs often involving big data and can execute trillions of instructions per second on large data sets” (Section 2.3). Code also allows users to interact with these devices through input and output devices like keyboards, mice, monitors, and printers.
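
A rough Python sketch can make this memory hierarchy concrete: the same ten megabytes of data are copied within RAM and then written out to a file on disk. The exact timings vary widely by machine, so this only illustrates the relative difference, and the file name is invented for the demo:

```python
import os
import time

data = b"x" * 10_000_000  # ~10 MB of data held in RAM

start = time.perf_counter()
in_ram = bytes(data)                  # copy made entirely within RAM
ram_seconds = time.perf_counter() - start

start = time.perf_counter()
with open("demo.bin", "wb") as f:     # same bytes written to storage
    f.write(data)
    f.flush()
    os.fsync(f.fileno())              # force the write to reach the disk
disk_seconds = time.perf_counter() - start

print(f"RAM copy:   {ram_seconds:.5f} s")
print(f"Disk write: {disk_seconds:.5f} s")
os.remove("demo.bin")                 # clean up the demo file
```

On most machines the in-memory copy finishes far sooner than the disk write, which is why programs keep active data in RAM and reach for slower storage only when necessary.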

Once a user can input, output, and see information on a computer, programming languages can be used to write programs such as an operating system or a word processor. Programming languages are implemented through two standard approaches, compilation and interpretation, which can also be used together. According to FreeCodeCamp (2020), “compiled languages are converted directly into machine code that the processor can execute” (para. 5). In contrast, “Interpreters run through a program line by line and execute each command” (para. 8). Programmers use different languages such as Python, C, C++, and Java for various applications, with Java being one of the most popular. According to Samoylov (2018), “Java stood out due to its simplicity, portability, interoperability, and safety nets” (p. 10). It gave programmers the environment and tools needed to create applications for computers, smartphones, and other devices with fewer of the memory leaks that C and C++ are prone to. Furthermore, programmers can create databases that store data in tables of rows and columns, which can be accessed using languages like Structured Query Language (SQL). Programming languages also allow programmers to protect computer systems and networks by creating programs such as antivirus and anti-malware tools and software updates.
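
A minimal sketch of that rows-and-columns model is shown below, using Python's built-in sqlite3 module; the "users" table and its columns are invented purely for illustration:

```python
import sqlite3

connection = sqlite3.connect(":memory:")  # temporary in-memory database
cursor = connection.cursor()

# A table stores data as rows and columns.
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, role TEXT)")
cursor.executemany(
    "INSERT INTO users (name, role) VALUES (?, ?)",
    [("Ada", "programmer"), ("Charles", "engineer")],
)

# SQL describes WHICH rows are wanted, not how to find them.
for row in cursor.execute("SELECT name, role FROM users WHERE role = ?", ("programmer",)):
    print(row)  # ('Ada', 'programmer')

connection.close()
```

The same SELECT, INSERT, and WHERE vocabulary carries over to production database systems, which is part of why SQL has remained so widely used.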

Furthermore, programming languages have been instrumental in creating the internet and other networked services, such as cloud computing and social media, allowing ever more information to be stored. This information is valuable and often comes under constant threat. Attackers use programming languages to write instructions that gain unauthorized access to networks and computing systems; common hazards include viruses, worms, trojans, adware, and spyware. Code is also used to conduct denial-of-service attacks, such as ping floods, on websites, often using a botnet to overload a server with fake requests that swamp the system and prevent it from responding to genuine ones. To combat these challenges, programmers turn to programming languages to create antivirus software and other security applications that disable and quarantine existing threats (one simple defensive idea is sketched below). Many of the languages used to develop legitimate applications, such as Java, Python, and SQL, are also used by cybercriminals: Java is often used to create botnets or steal information, while Python and SQL are used to gain access to computing and database systems. It is therefore essential for cybersecurity specialists to be proficient in the programming languages cybercriminals use so they can find and resolve security breaches.
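
As a simplified sketch of one such defense, the Python snippet below implements a per-client rate limiter of the kind servers use to shed flood traffic. The ten-requests-per-second threshold and the sample IP address are invented for illustration; real systems use far more sophisticated detection:

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 1.0          # length of the sliding time window
MAX_REQUESTS_PER_WINDOW = 10  # invented threshold for the demo

request_log = defaultdict(list)  # client address -> recent request timestamps

def allow_request(client_address: str) -> bool:
    """Return True if this client is still under the rate limit."""
    now = time.monotonic()
    # Keep only the timestamps that fall inside the current window.
    recent = [t for t in request_log[client_address] if now - t < WINDOW_SECONDS]
    request_log[client_address] = recent
    if len(recent) >= MAX_REQUESTS_PER_WINDOW:
        return False  # looks like a flood; reject the request
    request_log[client_address].append(now)
    return True

# A burst of 15 requests from one address: the first 10 pass, the rest are blocked.
results = [allow_request("203.0.113.7") for _ in range(15)]
print(results.count(True), "allowed,", results.count(False), "blocked")
```

Defenses like this illustrate the essay's larger point: the same programming skills that enable an attack are what make the countermeasure possible.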



References

FreeCodeCamp. (2020, January 10). Interpreted vs compiled programming languages: What's the difference? FreeCodeCamp.org. https://www.freecodecamp.org/news/compiled-versus-interpreted-languages/

Paulsen, G. (2014). When switches became programs: Programming languages and telecommunications, 1965-1980. IEEE Annals of the History of Computing, 36(4), 38–50. https://doi.org/10.1109/MAHC.2014.64

Samoylov, N. (2018). Introduction to programming: Learn to program in Java with data structures, algorithms, and logic. Packt. https://ebookcentral.proquest.com/lib/ashford-ebooks/reader.action?docID=5434477

Vahid, F., & Lysecky, S. (2019). Computing technology for all. zyBooks. https://learn.zybooks.com/zybook/TEC101:_Fundamentals_of_Information_Technology_&_Literacy_(TED2249A)
