
The Story behind Mainframe to Cloud Journey

Mainframe to CLOUD: Mainframe computing took off in the 1950s and gained much prominence throughout the 1960s. Corporations such as IBM (International Business Machines), Univac, DEC (Digital Equipment Corporation), and Control Data Corporation developed powerful mainframe systems.

These mainframe systems mainly carried out number-crunching for scientists and engineers. The main programming language used was Fortran. Then in the 1960s, the notion of database systems was conceived and corporations developed database systems based on the network and hierarchical data models. The database applications at that time were written mainly in COBOL.

Cloud Vs Mainframe

In the 1970s, corporations such as DEC created the notion of mini-computers. An example is DEC's VAX machine. These machines were much smaller than the mainframe systems. Around that time, terminals were developed. This way, programmers did not have to go to computing centers and use punch cards for their computations. They could use their terminals and submit the jobs to the computing machines. This was a huge step forward. It was also during this time that languages such as C and operating systems such as UNIX were developed.

A significant development in the late 1970s was the emergence of the personal computer. This led to the founding of Apple Computer. Soon after, IBM developed its own personal computer, and Microsoft developed the DOS operating system for these IBM machines. Powerful workstations were developed in the early 1980s by corporations such as Sun Microsystems, Apollo, and HP (Hewlett Packard). Database systems based on the relational data model were developed by corporations such as IBM and Oracle. By the mid-1980s, computers were poised to take over the world.

Distributed Computing

With the development of the ARPANET by DARPA (Defense Advanced Research Projects Agency), which later evolved into the Internet, networked systems gained momentum in the 1970s and the early products came out in the 1980s. Computers were networked together, communicating with each other and exchanging messages through what is now known as email. Several applications were developed for these distributed systems. The idea was to utilize the resources of multiple machines to carry out a computation. The late 1980s also saw the emergence of parallel computing.

A computing paradigm that exploded in the early 1990s was the distributed object paradigm. Here, computations were encapsulated as objects, and these objects communicated with each other by exchanging messages. This work led to the formation of consortia such as the Object Management Group (OMG). It was at this time that object-oriented languages such as Smalltalk and C++ rose to prominence.

Evolution of the WWW

In the early 1990s, one of the major innovations of the twentieth century was initiated: the World Wide Web (WWW). Tim Berners-Lee, the inventor of the WWW, was a programmer at CERN in Geneva, Switzerland. He started a project to help physicists share data, and this project resulted in the WWW. Around the same time, programmers at NCSA (the National Center for Supercomputing Applications) at the University of Illinois developed the Mosaic browser. These two innovations enabled ordinary people to use the WWW to query and search for information. The late 1990s saw the emergence of several search engines such as AltaVista and Lycos. Then two researchers from Stanford University started a company called Google, which is now the largest web search company in the world. Java also became one of the most popular programming languages.

CLOUD Computing

A typical cloud stack consists of the following layers:

- Cloud application layer
- Cloud data layer
- Cloud storage layer
- Cloud operating system and hypervisor layer

The late 1990s also saw what is now called the dot-com boom. Several service companies were formed, giving rise to electronic commerce. However, the infrastructure technologies were not mature at the time and, as a result, many of these companies did not survive. In the late 1990s and early 2000s, the notion of web services based on the service paradigm emerged. With these service technologies, better infrastructures were built for e-commerce, and corporations began providing services to consumers based on the service paradigm.

Developments in services computing, distributed computing, and the WWW have resulted in cloud computing. The idea is to provide computing as a service, just as we consume electricity as a service. That is, a cloud service provider offers different levels of service to the consumer: the service could be using the cloud for computing, for database management, or for application support such as organizing one's finances.
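The layered cloud stack described earlier can be pictured as a simple delegation chain, where each layer sits on top of the one below it. The sketch below is only an illustration of that stacking idea; the class and layer names are hypothetical and do not correspond to any real provider's API.

```python
# A minimal sketch of the cloud service-layer idea.
# All names here are illustrative, not a real cloud API.

class CloudLayer:
    """One layer of the cloud stack, delegating to the layer below it."""
    def __init__(self, name, below=None):
        self.name = name
        self.below = below

    def stack(self):
        """Return layer names from this layer down to the hypervisor."""
        layers = [self.name]
        if self.below is not None:
            layers += self.below.stack()
        return layers

# Build the four layers named in this post, bottom-up.
hypervisor = CloudLayer("operating system and hypervisor")
storage = CloudLayer("storage", below=hypervisor)
data = CloudLayer("data", below=storage)
application = CloudLayer("application", below=data)

print(application.stack())
# ['application', 'data', 'storage', 'operating system and hypervisor']
```

A consumer interacting at the application layer never touches the layers beneath it directly; each layer relies on the services of the one below, which is the essence of the stack listed above.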
