Internet and Social's Influence on Programming Evolution
By Yulian Kuncheff
Progress & Changelog:
- Oct 22nd - Began in-depth research
- Oct 23rd - Spent some time finding correlations
- Oct 23rd - Created progress PowerPoint for class and solidified what I am going after.
- Dec 11th - Removed PowerPoint outline, research plan, and goals for my research; added final write-up. Updated sources with new sources used. Added images.
As technology and science have evolved, programmers and programming languages have had to grow and evolve along with them. Most of these changes have been driven by consumer demand. As computers and technology became more widespread, consumers wanted technology that was easier to use. This tasked programmers with creating new ways of writing software and devising languages that made it easier to build better, more usable software. I will first give a quick rundown of how languages progressed and came onto the scene.
Before 1967, most languages did not really follow a paradigm. Languages were written to make programming easier and faster, and to allow more complex things to be done with them. Around 1967, languages started showing signs of following paradigms. Simula and Smalltalk were among the first object-oriented languages, Prolog was one of the first logic programming languages, ML and Scheme were among the first functional languages, and B, C, Pascal, and Forth were among the first imperative languages.
In the last 10 years, languages have become more Internet-focused and feature many mixed paradigms. The goal of programming languages now is to make programming easier and quicker for programmers, which improves turnaround times and the ROI of ideas. In this span, C#, F#, Groovy, Scala, Factor, Clojure, and Go came out, bringing rich multi-paradigm feature sets built on past languages and virtual machines. C#, for example, can be written in an imperative, object-oriented, or functional style depending on how you use it.
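To make the multi-paradigm point concrete, here is a minimal sketch of the same computation written three ways. I use Python rather than C# purely for brevity; the `Summer` class is a made-up name for illustration:

```python
from functools import reduce

numbers = [1, 2, 3, 4, 5]

# Imperative style: mutate an accumulator step by step.
total_imperative = 0
for n in numbers:
    total_imperative += n

# Functional style: compose pure functions, no mutation.
total_functional = reduce(lambda acc, n: acc + n, numbers, 0)

# Object-oriented style: wrap the behavior in a class.
class Summer:
    def __init__(self, values):
        self.values = values

    def total(self):
        return sum(self.values)

total_oo = Summer(numbers).total()

print(total_imperative, total_functional, total_oo)  # all three are 15
```

All three styles live comfortably in one file, which is exactly what the newer multi-paradigm languages are designed to allow.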
For the most part, not much has changed for the big languages like Java or C. But C++ is in a slow decline, along with Perl and Visual Basic, while languages behind today's leading technologies, like Objective-C (thanks to the iPhone) and Python, are on the rise. These trends fluctuate a lot with every revision of a language and as new libraries and technologies are built on it. Java seems to reign supreme, and will most likely hold this title for many more years, until there is a platform that can match it in power. The closest contenders, I believe, are C#, C, and Python; as these languages grow and become more widespread, they will begin to eat away at Java's share.
So how has the socialization of the Internet and computers affected how programmers and programming languages are used? Primarily, consumer expectations about speed and ease of use have put a burden on programmers to squeeze as much power and convenience out of programming as possible, so they can release their products to market quicker with more features and polish. Languages now handle the memory management and grunt work of the system, so all programmers need to do is write the logic of their program. But as consumers, employers, and others learn how these systems make the programmer's job easier, they expect results faster and with fewer errors. The room for error grows smaller and smaller.
Along with this, in the early 2000s, C and C++ were the de facto languages. If you didn't know them, finding a job was difficult, as very few employers used anything else. Only a few years later, with the boom of social and Internet services, people needed fast, scalable, and easy languages and platforms to build the massive social platforms in use now. Twitter, Facebook, LinkedIn, and most social sites are written in garbage-collected, high-level languages that do all the memory work and are much easier to scale.
An example would be Twitter. It was written with Ruby on Rails and used a MySQL database as its backend. At first this was enough, but as the site grew to the size it is today, they had to explore more open and scalable technologies. While most of the site still runs on Ruby on Rails, and some parts are still on MySQL, they have begun switching to other technologies to handle the massive user base and traffic the site sees daily. Ruby on Rails is a high-level framework and one of many that came from the Internet age, but newer and better technologies keep arriving. Twitter has begun moving to Scala on the JVM; only parts of the system were changed, but it allowed Twitter to scale better under high loads. Another technology they use is Lucene instead of MySQL for search: Lucene is a dedicated search application that handles searches much better than a large set of MySQL queries. The final technology they employ is BitTorrent, whose distributed nature lets them deploy to data centers faster and with error protection. All of this would take much more work and code to achieve with lower-level programming languages.
Counterexamples do exist. OkCupid, a popular online dating site, runs its entire stack on C++: its server daemons and web server are all custom-built in C++. The site seems to scale well and its user base is growing steadily. This shows it is possible to write it all in C++, but while you can, it may not be the most effective or scalable way to do it.
So why has this focus changed? While C and C++ can probably handle the loads and features these sites require, the time and difficulty of writing software that does the same things and scales the same way would be much, much greater. The same program in C or C++ would likely take thousands more lines of code and reimplementations of functions that are already built into other languages. Along with this, with increasing hardware power and resilience, our computers can run these high-level languages with minimal difference from lower-level ones. A great example is TinyP2P, a P2P program written in 15 lines of Python. Something that would normally take hundreds of lines in other languages was achieved in 15. This is one of the major reasons: ease of programming.
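In the same spirit as TinyP2P (though this is not TinyP2P itself, just an analogous sketch), Python's standard library can stand up a complete static-file web server in a handful of lines, where a C version would need manual socket handling, HTTP parsing, and file I/O:

```python
import http.server
import socketserver

# A complete static-file web server using only the standard library.
# Port 0 asks the OS for any free port, avoiding conflicts.
with socketserver.TCPServer(("", 0), http.server.SimpleHTTPRequestHandler) as httpd:
    port = httpd.server_address[1]
    print(f"Serving the current directory at http://localhost:{port}/")
    # httpd.serve_forever()  # uncomment to actually handle requests
```

The entire "implementation" is delegated to library code that has already been written, debugged, and optimized, which is precisely the productivity argument for high-level languages.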
The final aspect is money. It is much cheaper to pay two or three programmers to do the work in a month than six or more programmers to do it in six months. Hardware costs also drop, because the greater scalability of these languages allows cheaper implementations: you can run more on less.
Now to turn back a bit: in the early days of programming, most innovation was spurred simply by wanting to see what could be done with computers. Programming languages weren't well defined and were constantly changing. Programmers weren't programming for consumers per se, but to push the boundaries of computing. As most programmers were also engineers, they had a more intimate knowledge of the workings of the technology and could manipulate it with greater ease at the hardware level. This also let them use rough, very low-level languages without too much effort, as they understood what they were doing.
The Consumers Chime In
But things began to shift. Computers were appearing in more places, and general consumers wanted to do things more easily. So the focus for programmers and engineers shifted slightly, from breaking new boundaries to making computing easier. Boundaries were still being pushed in the process, but the work began to cater more to consumer needs. Once personal computing with Macs and PCs began to grow, the shift only continued. Today, the primary focus of programming is to create software for consumers. Plenty of researchers and programmers still push the boundaries of technology to innovate, but the vast majority of the focus is now on social networks, websites, and software platforms that interconnect the masses and provide them an easy terminal into technology, without requiring vast knowledge of a computer's inner workings.
Another example is that programming has become a field of its own. Computer Engineering focuses on hardware and programming that hardware; Computer Science has become the theoretical study of algorithms and paradigms, with large doses of math. This shows that programming is now a whole beast of its own, grown beyond just controlling the hardware the engineers make. In the early days, programming innovation and change were spurred by programmers simply being fed up with the tools they had. They wanted easier, more powerful, and faster, so they sat down and built it. James Gosling was not happy with C and C++, so he and a group at Sun Microsystems sat down and wrote the Java Virtual Machine and the Java programming language. They spurred the growth of object-oriented programming and of the Java Virtual Machine at the expense of C and C++ programming. Today, Java is one of the most popular languages in the world, and the Java Virtual Machine has a home on many different operating systems thanks to its cross-platform goals.
Conclusion & Predictions
Looking into the future, I can easily see programming becoming something the average user can do, using a drag-and-drop interface and just a little reasoning and logic. The technology for this already exists, and there are many projects in their infancy that aim to provide such an interface. The one I have heard most about is Google's App Inventor, which lets you create an Android app just by dragging and dropping UI elements, then using a graph-based logic map to create functionality. The apps it creates are not very unique or powerful, but these are the first steps. I don't see this type of programming taking over anytime in the near future, and the need for trained programmers with Computer Science degrees to write software will still exist. But I do foresee a day when the average consumer will be the programmer. Then only the limits of the imagination of the masses will be our blockades.
Programming is increasingly becoming consumerized, or at least heavily steered by consumer interests and by corporations wanting to make money on those interests. But that is not necessarily a bad thing. Whether to appease consumers or to make programmers' time spent programming more enjoyable and fun, these changes are needed. I don't think they are bad; they are just evolving with the rest of the advances of the world.
Sources
- Wikipedia on History of Programming Languages
- TIOBE Popularity Charts
- Twitter on Scala
- TinyP2P on About.com
- Murder: Fast Data Center Deploys using BitTorrent
- Various social websites' about pages and histories.