
Scaling Up and Scaling Out QlikView: Two Ideas You Can't Miss

In a scale-up architecture, a single server serves the QlikView applications.
As more throughput is required, bigger and/or faster hardware (e.g. more RAM
and/or CPU capacity) is added to that same server.

In a scale-out architecture, more servers are added when more throughput is needed to
achieve the necessary performance. Commodity servers are commonly used in these
architectures: as more throughput is required, new servers are added, creating
a clustered QlikView environment. In these environments, QlikView Server supports load
sharing of QlikView applications across multiple physical or logical computers. QlikView load
balancing refers to the ability to distribute the load (i.e. end-user sessions) across the cluster
in accordance with a predefined algorithm that selects which node should handle a given
session. QlikView Server version 11 supports three different load balancing algorithms.

# The Scale-out QlikView Architecture

Below is a brief definition for each scheme. Please refer to the QlikView Scalability
Overview Technology white paper for further details.

Random: The default load balancing scheme. The user is sent to a random server,
regardless of whether the QlikView application the user is requesting is already loaded on any QlikView Server.

Loaded Document: If only one QlikView Server has the particular QlikView application
loaded, the user is sent to that QlikView Server. If more than one QlikView Server
or none of the QlikView Servers have the application loaded, the user is sent to the
QlikView Server with the largest amount of free RAM.

CPU with RAM Overload: The user is sent to the least busy QlikView Server.
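The three schemes above can be sketched as simple selection functions. This is an illustrative sketch only: the server list, its fields (`free_ram_gb`, `cpu_load`, `loaded_docs`), and the helper names are assumptions for the example, not QlikView's actual implementation, and "CPU with RAM Overload" is approximated here simply as "lowest CPU load".

```python
import random

# Hypothetical cluster state; fields and values are illustrative assumptions.
servers = [
    {"name": "qvs1", "free_ram_gb": 32, "cpu_load": 0.60, "loaded_docs": {"Sales.qvw"}},
    {"name": "qvs2", "free_ram_gb": 64, "cpu_load": 0.20, "loaded_docs": set()},
    {"name": "qvs3", "free_ram_gb": 16, "cpu_load": 0.85, "loaded_docs": {"Sales.qvw"}},
]

def pick_random(servers, doc):
    # Random: default scheme; ignores whether the document is loaded anywhere.
    return random.choice(servers)

def pick_loaded_document(servers, doc):
    # Loaded Document: if exactly one server already has the document loaded,
    # use it; otherwise fall back to the server with the most free RAM.
    with_doc = [s for s in servers if doc in s["loaded_docs"]]
    if len(with_doc) == 1:
        return with_doc[0]
    return max(servers, key=lambda s: s["free_ram_gb"])

def pick_cpu_with_ram_overload(servers, doc):
    # CPU with RAM Overload: send the user to the least busy server
    # (approximated here as the one with the lowest CPU load).
    return min(servers, key=lambda s: s["cpu_load"])

# Sales.qvw is loaded on two servers, so Loaded Document falls back to free RAM.
print(pick_loaded_document(servers, "Sales.qvw")["name"])        # qvs2 (most free RAM)
print(pick_cpu_with_ram_overload(servers, "Sales.qvw")["name"])  # qvs2 (lowest CPU load)
```

In the fallback case of Loaded Document, the user is routed by free RAM exactly as described above: with `Sales.qvw` loaded on two servers, the tie is broken in favor of `qvs2`, which has the most free memory.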

Please note that this report does not go into detail on when to use, or how to tune, the different
load balancing algorithms for best performance. The cluster tests presented in this
report were run in an environment configured with the better-performing scheme for the
specific conditions of each test.

