Thursday, August 28, 2014

QV15 Hierarchy In QLIKVIEW

QlikView, wow, a great tool again. It solves the big problem of handling hierarchical data.
We have two prefixes in all, which can be used at script level:
 - Hierarchy (up to 8 parameters can be passed)
 - HierarchyBelongsTo (up to 6 parameters can be passed)


Both prefixes can be used in front of a LOAD or SELECT statement.

Hierarchy:
It converts an adjacent nodes table into an expanded nodes table, in which each level of child records is written to a separate field.

Let's do some hands-on practice with the table scott.emp!

Create the connection and load the table.
You can copy-paste the script below directly into your script editor after adding the connection. I have three sets of queries; we will see the behaviour of each set one by one:

//Hierarchy(EMPNO, MGR, ENAME)
//LOAD EMPNO,
//     ENAME,
//     JOB,
//     MGR,
//     SAL;
//SQL SELECT *
//FROM SCOTT.EMP;



 
:::
Hierarchy(EMPNO, MGR, ENAME)
This call takes three parameters:
1. EMPNO (NodeID) - the field containing the id of the child record
2. MGR (ParentID) - the field referencing the parent record
3. ENAME (NodeName) - any field can be used here, depending on what data we want to show in hierarchical form; I want to see the employee names in hierarchical order.


Hierarchy(EMPNO, MGR, ENAME, asdf, ENAME, [hierarchy_goeg], ';', 'HIERARCHY DEPTH')
LOAD EMPNO,
     ENAME,
     JOB,
     MGR,
     SAL;
SQL SELECT *
FROM SCOTT.EMP;





:::
Hierarchy(EMPNO, MGR, ENAME, asdf, ENAME, [hierarchy_goeg], ';', 'HIERARCHY DEPTH')
This call takes eight parameters:
1. EMPNO (NodeID) - the field containing the id of the child record
2. MGR (ParentID) - the field referencing the parent record
3. ENAME (NodeName) - any field can be used here, depending on what data we want to show in hierarchical form; I want to see the employee names in hierarchical order.
4. asdf (ParentName) - I used a dummy name; normally this names the new field holding the parent record's description
5. ENAME (PathSource) - the field used to build the path: a string containing one value per ancestor down to the node
6. [hierarchy_goeg] (PathName) - the name of the field that will contain the path
7. ';' (PathDelimiter) - the character used to separate the path elements
8. 'HIERARCHY DEPTH' (Depth) - the name of the field that will contain the depth of the node
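As a quick illustration of what the Hierarchy prefix produces, here is a minimal sketch on a hypothetical three-row inline table (not scott.emp; the field names NodeID, ParentID, Name, PathField and DepthField are my own):

```qlikview
// Toy adjacent nodes table: KING > BLAKE > ALLEN
Tree:
Hierarchy(NodeID, ParentID, Name, ParentName, Name, PathField, '/', DepthField)
LOAD * INLINE [
NodeID, ParentID, Name
1, , KING
2, 1, BLAKE
3, 2, ALLEN
];
// Besides the original fields, the expanded table gets roughly:
// Name1,  Name2,  Name3,  PathField,         DepthField
// KING,   -,      -,      KING,              1
// KING,   BLAKE,  -,      KING/BLAKE,        2
// KING,   BLAKE,  ALLEN,  KING/BLAKE/ALLEN,  3
```

Each level of the tree becomes a field of its own (Name1, Name2, ...), which is what makes tree views and drill-down groups easy to build on the result.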





HierarchyBelongsTo:
The HierarchyBelongsTo prefix is used to transform an adjacent nodes table into a table holding every child-ancestor relation. An adjacent nodes table is a table where each record corresponds to a node and has a field that contains a reference to the parent node. (That is, the table stores data in parent-child relations with the help of two fields; a parent value is repeated across rows until all children corresponding to that parent have been written. See the first two fields in the image below: that is an adjacent nodes table.)
:::
HierarchyBelongsTo(EMPNO,MGR,ENAME, 'ANCESTORS_KEY','ANCESTORS_NAME', 'Depth')
This call takes six parameters:
1. EMPNO (NodeID) - the field containing the id of the child record
2. MGR (ParentID) - the field referencing the parent record
3. ENAME (NodeName) - any field can be used here, depending on what data we want to show in hierarchical form; I want to see the employee names in hierarchical order.
4. 'ANCESTORS_KEY' (AncestorID) - the name of the new field that will hold the ancestor record's id
5. 'ANCESTORS_NAME' (AncestorName) - the name of the new field that will hold the ancestor record's name
6. 'Depth' (DepthDiff) - the name of the new field holding the depth of the node relative to the ancestor (0 for the node itself, counting up by 1 towards the root)
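Before running it against scott.emp, the behaviour can be sketched on a hypothetical three-row inline table (the field names NodeID, ParentID, Name, AncestorID, AncestorName and DepthDiff are my own):

```qlikview
// Toy adjacent nodes table: KING > BLAKE > ALLEN
Ancestors:
HierarchyBelongsTo(NodeID, ParentID, Name, AncestorID, AncestorName, DepthDiff)
LOAD * INLINE [
NodeID, ParentID, Name
1, , KING
2, 1, BLAKE
3, 2, ALLEN
];
// Every node is paired with each of its ancestors, including itself:
// NodeID, Name,  AncestorID, AncestorName, DepthDiff
// 1,      KING,  1,          KING,         0
// 2,      BLAKE, 2,          BLAKE,        0
// 2,      BLAKE, 1,          KING,         1
// 3,      ALLEN, 3,          ALLEN,        0
// 3,      ALLEN, 2,          BLAKE,        1
// 3,      ALLEN, 1,          KING,         2
```

This pair-per-ancestor structure is what lets you select a manager and automatically pick up the whole subtree below them.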


geo2:
HierarchyBelongsTo(EMPNO,MGR,ENAME, 'ANCESTORS_KEY','ANCESTORS_NAME', 'Depth')
LOAD EMPNO,
     ENAME,
     JOB,
     MGR,
     SAL;
SQL SELECT *
FROM SCOTT.EMP;





 

Thursday, August 21, 2014

QV14 Peek() used in for loop

The Peek() function is also very useful inside a for loop in QlikView.
Let's look at a little example; I googled this example and ran it on my local machine.

In the example, we have a table FileListTable with two fields, Date1 and FilNme. FilNme, i.e. the filename: all the files exist on the local hard disk; we load each file and can then perform any kind of string operation on the loaded file.
In the for loop:
1. NoOfRows() returns the total number of rows in the table.
2. Peek() uses zero-based row numbers, so the logic passes "vFileNo-1"; for the last row that is total no. of rows - 1:
 i.e.  11-1 = 10 returns -> Airline Operations_ch8.qvw.2014_07_28_12_18_30.log
       10-1 = 9   Airline Operations_ch8.qvw.2014_07_28_11_34_41.log
       9-1  = 8   Airline Operations_ch8.qvw.2014_07_25_16_51_02.log
       ....
       2-1 = 1    Airline Operations_ch8.qvw.2014_07_25_15_38_06.log
       1-1 = 0    Airline Operations_ch8.qvw.2014_07_25_15_36_35.log

3. Inside the loop we create a new file name under which each log file will be saved as a text file.
4. All log files are stored in the form of "*.txt".







FileListTable:
Load * inline
[
Date1, FilNme
18-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_36_35.log
25-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_38_06.log
26-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_39_08.log
27-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_50_00.log
19-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_50_37.log
20-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_53_15.log
21-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_57_28.log
22-07-2014, Airline Operations_ch8.qvw.2014_07_25_16_48_30.log
23-07-2014, Airline Operations_ch8.qvw.2014_07_25_16_51_02.log
24-07-2014, Airline Operations_ch8.qvw.2014_07_28_11_34_41.log
28-07-2014, Airline Operations_ch8.qvw.2014_07_28_12_18_30.log
]



For vFileNo = 1 to NoOfRows('FileListTable')
      Let vFilNme = Peek('FilNme', vFileNo-1, 'FileListTable');
      xyz:
      Load *, '$(vFilNme)' as FilNme
      From [$(vFilNme)];
      Let vfilenme = replace(vFilNme, '.log', '.txt');
      Store * from xyz into $(vfilenme);
Next vFileNo

 

QV13 Peek() and previous()

The Peek() function allows the user to look into a field that was not previously loaded into the script,
whereas the
Previous() function can only look into a previously loaded field.


Both are inter-record functions. These functions are used when a value from previously loaded records of data is needed for the evaluation of the current record. They load values from the column currently being evaluated, looking back at a previous value of that column.

peek(fieldname [ , row [ , tablename ] ] )
Returns the contents of fieldname in the record specified by row in the internal table tablename. Data are fetched from the associative QlikView database.
Fieldname must be given as a string (in single quotes).

Row must be an integer:
  • 0 denotes the first record,
  • 1 the second, and so on.
  • Negative numbers indicate order from the end of the table; -1 denotes the last record read.
  • If no row is stated, -1 is assumed.

If no tablename is stated, the current table is assumed.
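A minimal sketch of the row addressing (hypothetical inline table; the values in the comments follow directly from the rules above):

```qlikview
T1:
LOAD * INLINE [
Val
10
20
30
];

LET vFirst  = peek('Val', 0, 'T1');   // 10 - first record
LET vSecond = peek('Val', 1, 'T1');   // 20 - second record
LET vLast   = peek('Val', -1, 'T1');  // 30 - last record read (also the default)
```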


previous(expression)
Returns the value of expression using data from the previous input record. In the first record of an internal table the function returns NULL, because no previous record exists.

The previous function may be nested.
Data are fetched directly from the input source, making it possible to refer also to fields which have not been loaded into QlikView, i.e. even if they have not been stored in its associative database.
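A minimal sketch of previous() and nested previous() (hypothetical inline table; field names are my own):

```qlikview
T2:
LOAD
    Val,
    Previous(Val)           as PrevVal,   // NULL on the first record
    Previous(Previous(Val)) as PrevPrev   // NULL on the first two records
INLINE [
Val
10
20
30
40
];
```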

Some similarities and differences

The similarities:
 - Both allow you to look back at previously loaded rows in a table.
 - Both can be manipulated to look at not only the last row loaded but also earlier rows.

The differences:
 - Previous() operates on the Input of the Load statement, whereas Peek() operates on the Output of the Load statement (the same as the difference between RecNo() and RowNo()). This means that the two functions will behave differently if you have a Where clause.
 - The Peek() function can easily reference any previously loaded row in the table using the row number in the function, e.g. Peek('Employee Count', 0) reads the first row. A minus sign references from the last row up, e.g. Peek('Employee Count', -1) reads the last row. If no row is specified, the last row (-1) is assumed. The Previous() function needs to be nested in order to reference any row other than the previous one, e.g. Previous(Previous(Hires)) looks at the row loaded two before the current row.

So, when is it best to use each function? The previous() and peek() functions can both be used when a user needs to show the current value versus the previous value of a field that was loaded from the original file.
The peek() function is better suited when the user is targeting either a field that has not been previously loaded into the table or a specific row.

source http://community.qlik.com/blogs/qlikviewdesignblog/2013/04/08/peek-vs-previous-when-to-use-each

//example1

DR:
LOAD F1, F2, Peek(F1) as PeekVal, Peek('F1',1,'DR') as PeekVal1, Previous(F2) as PrevVal ;
// Where F2 >= 200;
LOAD * INLINE
[
F1, F2
A, 100
B, 200
C, 150
D, 320
E, 222
F, 903
G, 666
]
;



//example2

data of file "9_Peek vs Previous.xlsx"
Date Hired Terminated
1/1/2011 6 0
2/1/2011 4 2
3/1/2011 6 1
4/1/2011 5 2
5/1/2011 3 2
6/1/2011 4 1
7/1/2011 6 2
8/1/2011 4 1
9/1/2011 4 0
10/1/2011 1 0
11/1/2011 1 0
12/1/2011 3 2
1/1/2012 1 1
2/1/2012 1 0
3/1/2012 3 1
4/1/2012 3 4
5/1/2012 2 1
6/1/2012 1 0
7/1/2012 1 0
8/1/2012 3 0
9/1/2012 4 3
10/1/2012 0 3
11/1/2012 2 1
12/1/2012 0 0
1/1/2013 4 2
2/1/2013 2 2
3/1/2013 3 0


[Employees Init]:
LOAD
    rowno() as Row,
    Date(Date) as Date,
    Hired,
    Terminated,
    If(rowno()=1, Hired-Terminated, peek('Employee Count', -1) + (Hired-Terminated)) as [Employee Count]
From
[..\datafiles\9_Peek vs Previous.xlsx]
(ooxml, embedded labels, table is Sheet2);

[Employee Count]:
LOAD
    Row,
    Date,
    Hired,
    Terminated,
    [Employee Count],
    If(rowno()=1, 0, [Employee Count] - Previous([Employee Count])) as [Employee Var]
Resident [Employees Init] Order By Row asc;

Drop Table [Employees Init];




Image explanation:
1. Hired - number of employees hired
2. Terminated - number of employees terminated
3. [Employee Count] - accumulation (number of working employees minus terminated employees); we used the peek() function to get the desired result
4. [Employee Var] - employee variation, derived from the field [Employee Count] using previous().

// Notice that we use previous() in a separate load statement, because previous() works on a pre-loaded field.
// If we tried to use previous([Employee Count]) in the "[Employees Init]" load it would not work, because "[Employee Count]" is still being calculated in that same load statement.

Wednesday, August 20, 2014

QV12 Above() Below() function in qlikview

Above() and Below() belong to the chart inter-record functions category. These functions can only be used in chart expressions.

SYNTAX is:
above([ total ] expression [ , offset [,n ]])

It returns the value of the expression evaluated on a previous row of the chart.
In other words, it returns the value from the row that appears above the current row in a table chart.

Where can we use the Above() function?
- to create accumulations in a pivot table
- to get differences and running totals of sales (differences by month, year, or any other perspective), i.e. accumulative sums
Below is an example with some sample data; we will see the behaviour of the above() function in a straight table as well as in a pivot table.

ex:
LOAD * INLINE [
sno, no
1,20
2,578
3,344
4,54
5,34
6,34
7,34
8,34
]
;

In chart properties add the below expressions:
- sum(no)
- above(no) ~~ above(no,1)
- below(no,1)
- sum(aggr(rangesum(above(total no)),sno))

Reload the document and see the behaviour of table chart.


image



Explanation:
1. sum(no) - gives the total sum of the expression.
2. above(no) ~~ above(no,1)
As you can see, I have written above(no) and above(no,1) as two different entries, but both work the same: if we skip the offset, QlikView assumes 1 by default.
And as the image above shows, values start being written from the 2nd row, skipping 1 row, because the offset number is one. If we replace it with two, values start from the 3rd row.

 - You can also see the first row holding a null value, because there is no value above 20.
3. below(no,1) - writes into the column the value from the row below the current one.

Now we can see in the image that the above(no) and below(no) columns do not sum up the column.
4. To get a running sum for the column we need to combine "rangesum" with the "aggr()" function.



Above() behaviour with pivot table:
script the below inline table

abve:
LOAD * INLINE
[
Customer, FK, Sales
A, 1, 50
A, 2, 45
A, 3, 25
A, 4, 75
B, 1, 35
B, 2, 75
B, 3, 12
B, 4, 14
C, 1, 12
C, 2, 50
C, 3, 45
C, 4, 65
D, 5, 45
D, 6, 76
D, 7, 12
E, 1, 12
E, 2, 15
F, 1, 25
F, 2, 56
F, 3, 87
F, 4, 98
F, 5, 54
G, 1, 12
G, 2, 34
G, 3, 15
H, 1, 15
H, 2, 65
H, 3, 15
H, 4, 12
]
;


LOAD * INLINE
[
TestVal
1
2
5
6
7
]
;

Dimensions - Customer, FK
Expressions:
- Sum(Sales)
pivot_column - =above(column(1))
total_sum - =Above(TOTAL Column(1)) + If(Above(TOTAL Column(3)) > 0, Above(TOTAL Column(3)), 0)

sum customer wise - rangesum(Above(column(1), 1, RowNo()))

reload and create a pivot table chart.




Tuesday, August 19, 2014

QV11 Only() function in qlikview

The Only() function returns a value only when the field has exactly one possible value with respect to the dimension; otherwise it returns NULL. It always returns a single value.

pricem:
Load * inline [
Month, Price
1, 345
1, 342
1, 34
1, 345
1, 67
2, 76
2, 87
2, 67
2, 33
3, 321
3, 123
3, 324
3, 274
4, 33
4, 49
4, 192
5, 923
5, 347
5, 43
5, 84
5, 52
5, 55
6, 10
6, 12
6, 554
6, 546
7, 22
7, 10
7, 20
8, 30
8, 40
9, 50
10, 6
10, 60
10, 70
11, 80
12, 90
12, 90
]
;


Load Month, Only(Price) Resident pricem Group By Month;


The query above returns a value for months 9, 11 and 12 only.


This is where the Only() function affects calculations without the user knowing it: if there is no explicit aggregation function in an expression, QlikView uses the Only() function implicitly. Hence, in some cases:

- "Only(Date)" is used as a sort expression
- "=Only(Product)='Shoe'" is used as a search criterion.
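A minimal sketch of the implicit Only() (text-object expressions, using the pricem table above):

```qlikview
// These two text-object expressions are equivalent:
=Price        // no aggregation function given...
=Only(Price)  // ...so QlikView implicitly wraps the field in Only()
// Both show 50 when Month 9 is selected (only one possible Price value)
// and NULL when Month 5 is selected (several possible values).
```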

case will be updated soon

QV10 Cross table

What is a cross table:
A cross table is like any 2-D sheet, such as an Excel sheet or a feedback form, where facts are split across columns on the basis of dimensions.

Good examples of this:
- a sales record in an Excel sheet, with the month split over the columns
- a feedback form arranged month-wise in a file (e.g. a feedback form rating quality of work across various categories)

Dire:
LOAD Salesman,
     Year,
     Jan,
     Feb,
     Mar,
     Apr,
     May,
     Jun
FROM
[..\datafiles\7_cross_table.xls]
(biff, embedded labels, table is Sheet4$);

 
(example of cross table)

Why we use a cross-table load:
In QlikView, if you face a table like the one above and try to create an annual report with the year acting as a dimension, you first need a usable structure.

To create a meaningful report we need to denormalize the table (meaning we move columns into rows), which in QlikView is done with a cross-table load.

In the screenshot above, the calculated data is split over the columns, so it cannot easily be summed with respect to a dimension.





(We need to use the "CrossTable" keyword to change the table orientation)
 
 
Dire:
CrossTable(category, Data,2)
LOAD Salesman,
Year,
Jan,
Feb,
Mar,
Apr,
May,
Jun
FROM
[..\datafiles\7_cross_table.xls]
(biff, embedded labels, table is Sheet4$);

 *where category is the field name for the Jan-Jun column headers, Data is the field name that will hold their values, and 2 is the number of qualifier fields (Salesman, Year)
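As a rough sketch of what the CrossTable prefix does to the shape of the data (a hypothetical inline table stands in for the xls file):

```qlikview
Dire2:
CrossTable(category, Data, 2)
LOAD * INLINE [
Salesman, Year, Jan, Feb, Mar
Smith, 2013, 10, 20, 30
Jones, 2013, 5, 15, 25
];
// Result: the month columns become rows.
// Salesman, Year, category, Data
// Smith,    2013, Jan,      10
// Smith,    2013, Feb,      20
// Smith,    2013, Mar,      30
// Jones,    2013, Jan,      5
// ...
```

The first two fields (the 2 qualifier fields) are kept on every row, while each remaining column contributes one row per original record.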

Below is chart created with cross table "Dire"




Friday, August 8, 2014

QV9 Dynamic Update

Again, a great piece of functionality in QlikView is Dynamic Update. Through it we can write back to QlikView's dataset, and the written changes are reflected in the QlikView data model. Logically, these changes can later be written out to a QVD, XLS, or any other source file.
Below is my report UI


sub Insert
 set Result = ActiveDocument.DynamicUpdateCommand("INSERT INTO TEST  (SRNO,TYPE) VALUES (4,A)")
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub Update
 SQL_String="UPDATE TEST SET SR = 2 WHERE SRNO=3;"
  SET Result = ActiveDocument.DynamicUpdateCommand (SQL_String)
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub Delete
 SQL_String="Delete from TEST where SRNO =4;"
  SET Result = ActiveDocument.DynamicUpdateCommand (SQL_String)
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub SaveTableToFile
set obj = ActiveDocument.GetSheetObject("CH01")
obj.ExportEx "..\datafiles\6_dynamic_update.qvd", 4
end sub


INSERT INTO TEST (SRNO,TYPE) VALUES ($(vCnt), A $(vCnt))
UPDATE TEST SET CDATE = TODAY() WHERE SRNO=$(vCnt);
DELETE FROM TEST WHERE SRNO  = $(vCnt)

-- script editor code: copy and paste into the tab below
// execute only once then comment it
//LOAD SRNO,
//    CDATE,
//    TYPE,SR;
//SQL SELECT *
//FROM SCOTT.TEST;
//
//
//store TEST into ..\datafiles\6_dynamic_update.qvd;

TEST:
LOAD SRNO,
     CDATE,
     TYPE,
     SR
FROM
[..\datafiles\6_dynamic_update.qvd]
(qvd);

set vCnt = 1;
 

Thursday, August 7, 2014

QV8 Partial Reload

Partial loading - as the name suggests, instead of doing a complete load we do a partial one. Sometimes we make changes to the database, in the form of:
- creating a new table
- adding new attributes to a table
Now we want that table and the new fields to be reflected in the QlikView data model. So generally we reload the document.
But that is not a good deal if the reload takes hours to complete; it wastes all that time.
So, to save time, QlikView provides the partial load: instead of a full reload we load only those tables which have been modified in the database. This approach is called partial reload.
Advantage:
- During a partial reload all other load statements and scripts remain untouched.
  This means a partial reload executes only those commands where it finds "ADD" or "REPLACE" as a prefix to the LOAD statement.

How a partial reload can be initialized:
- Ctrl+Shift+R
- File > Partial Reload

There are two approaches:
- Use "ADD" or "REPLACE" in front of the LOAD statement (both partial reload and full reload will execute such a statement).
- We can also use "ADD ONLY" or "REPLACE ONLY" instead of "ADD" or "REPLACE".
  By adding "ONLY", the command will execute only and only when we do a partial reload of the document. A full reload of the document will not affect these LOAD statements.

- And we can use the IsPartialReload() function:
  it returns -1 when we do a partial reload (Ctrl+Shift+R)
  and returns 0 when we do a full reload (Ctrl+R)

  
-- With the IsPartialReload() function: put the code below in the script editor
if (IsPartialReload() = -1) then
add LOAD GRADE,
    LOSAL,
    HISAL;
SQL SELECT *
FROM SCOTT.SALGRADE;
end if;


The code above will be executed only and only if we do a partial reload.


-- With the ADD and REPLACE keywords
// The commented statement below was loaded once with Ctrl+R; it loads 3 columns into the data model.
//dasf:
//replace
//LOAD SRNO,
//     CDATE,
//     TYPE;
//SQL SELECT *
//FROM SCOTT."TEST";


dasf:
replace only
LOAD SRNO,
     CDATE,
     TYPE,
     SR;
SQL SELECT *
FROM SCOTT."TEST";


fligh:
add only
LOAD "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1";


- Now save it and do a partial reload (Ctrl+Shift+R).

It is suggested to check the data model after every reload.

QV7 Generic load in qlikview

A generic table is commonly used to store attribute values for different objects.
E.g. a ball and a box.

These attributes are not necessarily shared across all objects; objects may or may not have fields in common.
Say we have two objects, ball and box:
ball attributes - color, weight, diameter
box attributes - color, weight, length, height, width

Here we can see that color and weight are fields common to both objects, and the attributes and their values are stored in a single table as below.




Enter the Generic prefix.
It converts the data to a structure where each attribute is placed in a field of its own (each attribute now acts as an object entity of its own). Another way to express it is to say that it takes field values and converts them to field names. If you compare it to the CrossTable prefix, you will find that they are, in principle, each other's inverses.


The syntax is
   Generic Load Key, Attribute, Value From … ;


There are, however, a couple of things worth noting:
    Usually the input data has three columns: one qualifier field (Key in the above example), an Attribute and a Value. But you may also have several qualifying fields. If you have four or more columns, all columns except the last two will be treated as qualifying fields.
    The Generic prefix will create several tables: one table per attribute. This is normally not a problem. Rather, it is an advantage: it is the least memory-consuming way to store data if you have many attributes.


Now open QlikView, create a new document, save it, and open the script editor.
// generic load with one consolidated resulting table:
InputTable:
LOAD * INLINE [
object,attribute,value
ball,color,red
ball,diameter,10 cm
ball,weight,100 g
box,color,black
box,height,16 cm
box,length,20 cm
box,weight,500 g
box,width,10 cm
];
 

GenTable:
Generic Load object, attribute, value Resident InputTable;
 



 
After the reload, check the data model.


Now, as you can see in the image, every attribute acts as its own object entity.

After searching a lot on Google, I have found that the use of generic load is quite rare. It is used mainly when we want to change the orientation of a table.





Paste the code below into the same script editor to consolidate the generic tables back into a single table.
ResultTable:
LOAD Distinct object Resident InputTable;
 

FOR i = 0 to NoOfTables()
 TableList:
 LOAD
  TableName($(i)) as Tablename AUTOGENERATE 1
 WHERE
  WildMatch(TableName($(i)), 'GenTable.*');
NEXT i


FOR i = 1 to FieldValueCount('Tablename')
 LET vTable = FieldValue('Tablename', $(i));
 LEFT JOIN (ResultTable) LOAD * RESIDENT [$(vTable)];
 DROP TABLE [$(vTable)];
NEXT i

source -
http://community.qlik.com/blogs/qlikviewdesignblog/2014/03/31/generic

Wednesday, August 6, 2014

QV6 Preceding Load

What is a preceding load?
- A preceding load allows you to define multiple transformations and calculations within one load script;
without it, we would need another LOAD or RESIDENT statement.

BENEFITS - WHY PRECEDING LOAD
- We can run multiple loads on top of a previously loaded SELECT.
- An advantage of a preceding load is that it allows you to use QlikView functions in the LOAD script
  (on the other hand, you cannot use QlikView functions in the SQL SELECT query).
- It is used for transformation and calculation purposes.
- It is also suggested to use a preceding load in scenarios where QlikView functions would otherwise be used again and again, as in the example below:
  the 1st load statement (from the bottom) uses date functions and the 2nd load statement gets the difference.
  We could also write the '1st load statement from the bottom' as:

LOAD date(CREATED_DATE,'D-MMM-YYYY') AS [CREATED DATE], Today(1) AS TODAY_DATE, date(UPDATED_DATE,'D-MMM-YYYY') AS [UPDATED DATE], Today(1)-date(UPDATED_DATE,'D-MMM-YYYY') AS DIFF;


which causes overhead for QlikView, because we would be re-processing data that has already been calculated.



1. Open QlikView, create a new document and save it.
2. Press Ctrl+E (now you can see the script editor).
3. Create the respective ODBC connection to the database containing the table "FLIGHT_DATA_1".
4. You can get the "FLIGHT_DATA_1" create-table script from the post http://j4info.blogspot.in/2014/08/qv5-incremental-loading.html
5. Copy and paste the code below into the script editor.


LOAD "PRIMARY_KEY", IF ([DAYS DIFF] > 170 , 'YES','NO'),[DAYS DIFF];
LOAD "PRIMARY_KEY", TODAY_DATE - [UPDATED DATE] AS [DAYS DIFF];
LOAD "PRIMARY_KEY",date(CREATED_DATE,'D-MMM-YYYY') AS  [CREATED DATE] ,Today(1) AS TODAY_DATE , date(UPDATED_DATE,'D-MMM-YYYY') AS [UPDATED DATE];
SQL SELECT
 "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT"
FROM SCOTT."FLIGHT_DATA_1";



Now, in the code above, we have three preceding load statements. Preceding load statements execute from bottom to top; QlikView determines the execution order itself by searching for the "FROM / RESIDENT" clause of each respective load statement.
 - The SQL SELECT query loads the table from the database.
 - The 1st (bottom) load statement converts the raw dates to the actual date format.
 - The 2nd load statement gets the date difference between today's date and the updated date.
 - The 3rd (top) load statement uses an if condition: if the date difference > 170 it returns "YES", else "NO".

 

Sunday, August 3, 2014

QV5 Incremental loading

We have different types of loading in QlikView; incremental loading is one of them.

For incremental loading a QVD is necessary. Without a QVD, incremental loading is nothing.

First of all, the question arises: why is incremental loading necessary?

The answer: suppose we have multi-million-record tables, and a table can grow up to 50 GB in size. We are not going to load the data from the table every time. To save loading and processing time, the faster way is a QVD (a QVD can be up to 90% faster to read than any other source).
So we load all the data from the table into a QVD once, and QlikView then fetches its data from the QVD.

On the other hand, the database keeps growing over time, so our QVD file becomes outdated and we need to add the updated data into the QVD. To perform this, we use INCREMENTAL LOADING.


Below are the scripts for incremental loading:
1. insert only
2. update only
3. update and insert both

Here the most stable technique is insert only.


below is the create table script (for oracle):
 
create table flight_data_1 (primary_key number,created_date date ,updated_date date ,Year number ,Month number,Aircraft_Type_ID number,Departures_Scheduled number, Departures_Performed number, Available_Seats number, Transported_Passengers number, Transported_Freight number);

truncate table flight_data_1;
drop sequence s2;
create sequence s2;
select s2.nextval from dual;
insert into flight_data_1  values ( 1, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR') ,2009,2,631,35441,32893,2169889,1568806,116024);
insert into flight_data_1  values ( 2, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR') ,2009,2,631,35441,32893,2169889,1568806,116024);
insert into flight_data_1  values (3, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2009,3,35,1584,15632,82553,22820,574571);
insert into flight_data_1  values ( 4, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,2,456,8037,8146,273340,124917,64695);
insert into flight_data_1  values ( 5, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,2,817,0,885,84568,53799,94621719);
insert into flight_data_1  values ( 6, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,2,612,66780,68379,9265104,6895315,16661790);
insert into flight_data_1  values ( 7, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,3,625,3548,3960,394976,322208,101303920);
insert into flight_data_1  values ( 8, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,3,691,6964,6968,900,489,440725266);
insert into flight_data_1  values ( 9, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,2,614,36858,39681,6281400,4697543,11508820);
insert into flight_data_1  values ( 10, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2009,3,627,5542,9441,2504177,1917606,184951562);



--insert ---------------2010\
insert into flight_data_1  values ( 11, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,2,631,35441,32893,2169889,1568806,116024);
insert into flight_data_1  values (12,  to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,3,35,1584,15632,82553,22820,574571);
insert into flight_data_1  values ( 13, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,2,456,8037,8146,273340,124917,64695);
insert into flight_data_1  values ( 14, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,2,817,0,885,84568,53799,94621719);
insert into flight_data_1  values ( 15, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,2,612,66780,68379,9265104,6895315,16661790);
insert into flight_data_1  values ( 16, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,3,625,3548,3960,394976,322208,101303920);
insert into flight_data_1  values ( 17, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,3,691,6964,6968,900,489,440725266);
insert into flight_data_1  values ( 18, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,2,614,36858,39681,6281400,4697543,11508820);
insert into flight_data_1  values ( 19, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,3,627,5542,9441,2504177,1917606,184951562);
insert into flight_data_1  values ( 20, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2010,3,694,47487,50816,7533855,6181092,7007363);

commit;


----insert/update ----------
select s2.nextval from flight_data_1;
update flight_data_1 set DEPARTURES_SCHEDULED = 9191 , updated_date = to_date((sysdate-200)+s2.nextval,'DD-MON-RR') where primary_key = (select max(primary_key) from flight_data_1);


insert into flight_data_1  values ( 21, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,6,631,40093,38756,2546743,2067813,48786);
insert into flight_data_1  values ( 22, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,5,636,0,9,63,26,0);
insert into flight_data_1  values ( 23, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,4,626,10381,13826,2555544,2060087,296679796);
insert into flight_data_1  values ( 24, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,4,614,40138,44213,7006568,5598304,13376786);
insert into flight_data_1  values ( 25, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,4,694,46349,49157,7294399,5917732,6501940);
insert into flight_data_1  values ( 26, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,4,612,77015,79695,10810602,8346921,18806235);
insert into flight_data_1  values (27, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,4,619,32609,32096,4372298,3218725,7872327);
insert into flight_data_1  values ( 28, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,5,405,9617,14258,230490,102487,1257004);
insert into flight_data_1  values ( 29, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,5,406,0,478,4302,2006,11752);
insert into flight_data_1  values ( 30, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'),2011,5,614,40901,44135,6986832,5681758,14001919);

commit;


--update  -----------
update flight_data_1 set Departures_Scheduled=7777 , updated_date = to_date((sysdate-200)+s2.nextval,'DD-MON-RR') where created_Date = (select max(created_date) from flight_data_1);
insert into flight_data_1  values ( 31, to_date((sysdate-200)+s2.nextval,'DD-MON-RR'),to_date((sysdate-200)+s2.currval,'DD-MON-RR'), 2012,7,622,44864,45432,7437140,6623247,162142678);

commit;


Now open QlikView:
- create a new document
- save it
- copy and paste the scripts below, tab by tab
- I have defined 5 tabs (main, LoadMainQVD, insert, update, insert_update)
- save the QVD and reload it
- don't forget to create a new ODBC connection.


-----------------------


-----------------------
--main

// set exactly one flag to 'Y' to choose which tab runs on this reload
SET vLoadMainQVD = 'N';
SET vInsert = 'N';
SET vUpdate = 'N';
SET vInsertUpdate = 'Y';



------------------
------------------
---LoadMainQVD
IF '$(vLoadMainQVD)' = 'Y' THEN

ODBC CONNECT32 TO [scott;DBQ=ZA1 ] (XUserId is WDXfTZNMPDdIHfC, XPassword is HFAIEaFNHLZEHDA);


flights_2009:
LOAD "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1";



store flights_2009 into ..\datafiles\base_load_date_basis_qvd.qvd;


MaxDateLoad:
LOAD Max(date("CREATED_DATE",'D-MMM-YYYY')) as MaxDate
FROM ..\datafiles\base_load_date_basis_qvd.qvd (qvd);

Let MaxID = Null();
Let MaxID = peek('MaxDate', 0, 'MaxDateLoad');
endif;


----------------------------
-----------------------
---insert
IF '$(vInsert)' = 'Y' THEN

// get the max CREATED_DATE from the existing QVD; only records newer than this will be fetched
MaxDateLoad:
LOAD Max(date("CREATED_DATE",'D-MMM-YYYY')) as MaxDate
FROM
[..\datafiles\base_load_date_basis_qvd.qvd]
(qvd);



Let MaxID = Null();
Let MaxID = peek('MaxDate', 0, 'MaxDateLoad');



ODBC CONNECT32 TO [scott;DBQ=ZA1 ] (XUserId is bOfCeZNMPDdIHTB, XPassword is bLOCHaFNHLZEHAB);


IncrementalData:
LOAD "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
// $(MaxID) expands to a date literal such as 5-Mar-2012; Oracle must be able to implicitly parse this format
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1"
where "CREATED_DATE" > '$(MaxID)';


// append the historical rows from the QVD to the newly fetched rows
CONCATENATE(IncrementalData)
LOAD PRIMARY_KEY,
    CREATED_DATE,
     UPDATED_DATE,
     YEAR,
     MONTH,
     AIRCRAFT_TYPE_ID,
     DEPARTURES_SCHEDULED,
     DEPARTURES_PERFORMED,
     AVAILABLE_SEATS,
     TRANSPORTED_PASSENGERS,
     TRANSPORTED_FREIGHT
FROM
[..\datafiles\base_load_date_basis_qvd.qvd]
(qvd);


store IncrementalData into [..\datafiles\base_load_date_basis_qvd.qvd];


MaxDateLoad_AfterAppend:
LOAD Max(date("CREATED_DATE",'D-MMM-YYYY')) as MaxDate1
FROM [..\datafiles\base_load_date_basis_qvd.qvd] (qvd);

Let MaxID_AftrAppnd = Null();
Let MaxID_AftrAppnd = peek('MaxDate1', 0, 'MaxDateLoad_AfterAppend');

end if;
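After the append, a quick row count against the refreshed QVD makes it easy to verify that exactly one row was added. A minimal sketch (the RowCheck table name is my own, not part of the tabs above):

// optional sanity check: count rows in the refreshed QVD
RowCheck:
LOAD Count(PRIMARY_KEY) as TotalRows
FROM [..\datafiles\base_load_date_basis_qvd.qvd] (qvd);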


----------------
--------------------
--update
IF '$(vUpdate)' = 'Y' THEN

MaxDateLoad:
LOAD
    Max(date("UPDATED_DATE",'D-MMM-YYYY')) as MaxUpdatedDate,
    Max(date("CREATED_DATE",'D-MMM-YYYY')) as MaxCreatedDate
FROM
    [..\datafiles\base_load_date_basis_qvd.qvd]
(qvd);

Let MaxUpdatedDate = Null();
Let MaxUpdatedDate = peek('MaxUpdatedDate', 0, 'MaxDateLoad');

Let MaxCreatedDate = Null();
Let MaxCreatedDate = peek('MaxCreatedDate', 0, 'MaxDateLoad');

ODBC CONNECT32 TO [scott;DBQ=ZA1 ] (XUserId is bOfCeZNMPDdIHTB, XPassword is bLOCHaFNHLZEHAB);


IncrementalData:
LOAD  "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1"
where "CREATED_DATE" <= '$(MaxCreatedDate)'
and "UPDATED_DATE" > '$(MaxUpdatedDate)' ;




// reload the old rows from the QVD, skipping keys already fetched above;
// identical field sets auto-concatenate, but explicit CONCATENATE makes the intent clear
CONCATENATE (IncrementalData)
LOAD PRIMARY_KEY,
    CREATED_DATE,
     UPDATED_DATE,
     YEAR,
     MONTH,
     AIRCRAFT_TYPE_ID,
     DEPARTURES_SCHEDULED,
     DEPARTURES_PERFORMED,
     AVAILABLE_SEATS,
     TRANSPORTED_PASSENGERS,
     TRANSPORTED_FREIGHT
FROM
[..\datafiles\base_load_date_basis_qvd.qvd]
(qvd)
WHERE NOT Exists (PRIMARY_KEY)
;

store IncrementalData into [..\datafiles\base_load_date_basis_qvd.qvd];
end IF;
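The WHERE NOT Exists (PRIMARY_KEY) clause is what makes the update work: because the freshly updated rows are loaded from the database first, their key values are already registered, so the QVD load keeps only the rows that were not touched. A minimal standalone sketch with INLINE data (my own illustration, separate from the tabs above):

// rows loaded first register their KEY values
Fresh:
LOAD * INLINE [
KEY, VAL
2, updated
3, new
];

// NOT Exists(KEY) keeps only rows whose KEY has not been loaded yet,
// so the stale copy of KEY 2 is dropped and KEY 1 survives
CONCATENATE (Fresh)
LOAD * INLINE [
KEY, VAL
1, old
2, stale
] WHERE NOT Exists(KEY);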


---------------
---------------
---insert_update
IF '$(vInsertUpdate)' = 'Y' THEN



MaxDateLoad:
LOAD
    Max(date("CREATED_DATE",'D-MMM-YYYY')) as MaxDate,
    Max(date("UPDATED_DATE",'D-MMM-YYYY')) as MaxUPDate
FROM [..\datafiles\base_load_date_basis_qvd.qvd] (qvd);

Let MaxID = Null();
Let MaxID = peek('MaxDate', 0, 'MaxDateLoad');

Let MaxUPdated = Null();
Let MaxUPdated = peek('MaxUPDate', 0, 'MaxDateLoad');


ODBC CONNECT32 TO [scott;DBQ=ZA1 ] (XUserId is bOfCeZNMPDdIHTB, XPassword is bLOCHaFNHLZEHAB);

// fetch newly inserted and newly updated rows from the database
IncrementalData:
LOAD "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
// >= re-fetches rows updated on the boundary date; the NOT Exists() load below drops their stale QVD copies
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1"
where "CREATED_DATE" > '$(MaxID)' or UPDATED_DATE >= '$(MaxUPdated)';

Concatenate (IncrementalData)
LOAD PRIMARY_KEY,
     CREATED_DATE,
     UPDATED_DATE,
     YEAR,
     MONTH,
     AIRCRAFT_TYPE_ID,
     DEPARTURES_SCHEDULED,
     DEPARTURES_PERFORMED,
     AVAILABLE_SEATS,
     TRANSPORTED_PASSENGERS,
     TRANSPORTED_FREIGHT
FROM [..\datafiles\base_load_date_basis_qvd.qvd] (qvd) WHERE NOT Exists (PRIMARY_KEY) ;


store IncrementalData into ..\datafiles\base_load_date_basis_qvd.qvd;


end if;