Tuesday, November 4, 2014

QV20 Filter the null values of column in set analysis

I have the SCOTT.EMP table and want to filter out null values.
The basic requirement is to use set analysis to exclude the rows whose MGR is null.

Dimension - MGR
expression - SUM (SAL)



Now I want to remove the nulls from the chart dimension (i.e. the SUM = 5000 row where MGR is null).
This can be done in various ways, but here I want to do it with set analysis.


expression - Sum({<MGR -= {"=Len(Trim(MGR))=0"}>} [SAL])
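An equivalent positive form - a sketch, using the same EMP fields - keeps only the rows where MGR is non-empty instead of excluding the empty ones:

```
Sum({<MGR = {"=Len(Trim(MGR)) > 0"}>} [SAL])
```

Both forms rely on an expression search inside the element list; since nulls cannot be selected as field values, such a search is the usual way to handle them in set analysis.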

QV19 Replace all null values

Null values in a database can lead to big problems when calculating over dimensions.
In other words, it is not good if the database has null values in it.

It is better to replace all null values. In QlikView you can set a variable that replaces every null with a given string.

At the very start of the script (the Main tab of your document) you need to add:

NullAsValue *;
Set NullValue = 'NULL' ;

and then reload the document.

The image below shows the behaviour of the COMM column before and after adding the variable.

Thursday, October 30, 2014

QV18 Duplicate values in dimension

Duplicate values in dimensions:




To reproduce the issue:
create table region (area varchar2(30), district varchar2(30), city varchar2(30));
insert into region values ('chandigarh', 'chandigarh', 'chandigarh');
insert into region values ('punjab', 'patiala', 'patiala');
insert into region values ('punjab', 'patiala', 'nabha');
insert into region values ('punjab', 'patiala', 'SAN');
insert into region values ('punjab', 'SAS', 'SAS');
insert into region values ('punjab', 'SAS', 'ZRK');


I added the columns AREA, DISTRICT and CITY to the dimensions of a pivot table.
In CITY I have repeated values that also reside in DISTRICT,

and these repeated/duplicate values cause a bad result in the end.





The city in the red box is extra and can cause trouble, depending on the expression. In simple words: I don't want to see the repeated city name "patiala" in the CITY column,


where "patiala" acts as a city as well as a district.

We can fix this in two ways:
- at script level
- at chart level

AT SCRIPT level

At script level my table looks like this:




After modification in the script it looks like this:


Use the script below for this approach:
REG:
LOAD AREA,
    DISTRICT,
    if(CITY = DISTRICT,null(),CITY) as CITY;
SQL SELECT *
FROM SCOTT.REGION;



AT CHART level
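At chart level the usual approach - sketched here against the same REG table - is a calculated dimension that nulls out the duplicated value, combined with "Suppress When Value Is Null" ticked on the Dimensions tab of the chart properties:

```
=If(CITY = DISTRICT, Null(), CITY)
```

This leaves the loaded data untouched; only this one chart hides the repeated city.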









Friday, October 17, 2014

Use Go inside the Begin end in sqlserver

One day I was working on a task: if a column does not exist in the table, add it and update it with some values.

I used the script below:

IF NOT EXISTS(SELECT * FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'EMP' AND COLUMN_NAME = 'SAL_COMM')
BEGIN

ALTER TABLE EMP ADD SAL_COMM int;
UPDATE EMP SET SAL_COMM = coalesce(SAL+COMM,sal,0);
END


Msg 207, Level 16, State 1, Line 6
Invalid column name 'SAL_COMM'.

But the above error is raised.
WHY:
In the above set we are executing the ALTER and UPDATE commands in one batch. At parse time the parser checks the syntax and other things such as column and table existence.
When the parser reaches the UPDATE query it fails, because "SAL_COMM" is not found in the database.

SOLUTION:
We know the query fails because the column does not exist yet at parse time. So we need the ALTER command to take effect before the UPDATE command is parsed.
Normally we would get this with GO (the batch separator: the SQL engine executes all queries up to the point where GO is placed), but the issue is that we can't use GO inside a BEGIN..END block.

SO we need dynamic SQL, so that the UPDATE query bypasses the parser until execution time:

IF NOT EXISTS(SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'EMP' AND COLUMN_NAME = 'SAL_COMM')
BEGIN
ALTER TABLE EMP ADD SAL_COMM int;
exec sp_executesql N'UPDATE EMP SET SAL_COMM = coalesce(SAL+COMM,sal,0)';
END
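For comparison, outside a BEGIN..END block the same effect can be obtained with GO itself, since each batch is parsed separately (a sketch; note that the UPDATE batch then runs unconditionally, even when the column already existed):

```
IF NOT EXISTS(SELECT * FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'EMP' AND COLUMN_NAME = 'SAL_COMM')
    ALTER TABLE EMP ADD SAL_COMM int;
GO

-- parsed as a new batch, so SAL_COMM exists by now
UPDATE EMP SET SAL_COMM = COALESCE(SAL + COMM, SAL, 0);
GO
```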



Sunday, October 5, 2014

QV17 Subtract number from variable - set analysis expression

Sometimes we need to subtract or add a number to a variable's value. So I am posting how we can subtract a number from a variable inside a set analysis expression in QlikView.

From this post we can learn:
- how to use a variable in set analysis
- how to subtract a number from a variable

Let's have a look at the data.
I have the EMP table; the HIREDATE years vary from 1980 to 1987, as you can see in the image below.


I have also created a calendar as in an earlier post.

After creating the calendar we have the data model below.



After the reload we have added the objects below to the existing sheet.

The current selection is 1981 and the variable vMaxYear stores the value 1981. The chart has the dimensions:
DEPTNO
HIREDATE

and expression :
chart1 - Sum({<yearSeq = {$(=$(vMaxYear)-1)}>} SAL)
 will show the data of employees whose HIREDATE lies in the year 1980.

chart2 - Sum({<yearSeq=>} SAL)
 will show all data, irrespective of the year selection.
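One way to populate vMaxYear (a sketch; it assumes the yearSeq field from the calendar post) is a calculated variable defined in Settings > Variable Overview:

```
=Max(yearSeq)
```

Because the definition starts with "=", the variable is re-evaluated on every selection, so vMaxYear always holds the highest selected year.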




Saturday, October 4, 2014

string search in database postgres

I created a string-search function in Postgres, through which we can search for a string across all table columns of data type 'character', 'character varying' or 'text'.

Depending on the requirement you can add more data types to widen the search, or cut down the data types to narrow it.

This covers one requirement.
Using the same function we can also learn the topics below:
  • FOR loops in Postgres
  • dynamic execution in Postgres (EXECUTE)
  • returning a record set from a function using SETOF
To achieve this I learned how to return multiple rows (a record set) from a function, which is what I want to share with all of you.
First, we need to use SETOF with the RETURNS keyword. Second, we need to declare a variable of the record data type. Third, we need to use RETURN NEXT inside the FOR loop - it appends each row to the result set declared earlier.

drop function seing(char(50));

CREATE OR REPLACE FUNCTION seing(str char(50))
RETURNS setof  character varying AS
$BODY$
DECLARE
    tempCount bigint ;
    record_v record;
    query text;
BEGIN
    query:= 'select table_name ,column_name from information_schema.columns where table_schema = ''public'' and
        data_type in (''character'',''character varying'',''text'')';
    FOR record_v IN EXECUTE query LOOP
        EXECUTE 'select count(*) as a from ' || quote_ident(record_v.table_name) || ' where ' || quote_ident(record_v.column_name ) || ' like ''%'||str ||'%'';' into tempCount;
        --RAISE NOTICE 'loop test .. it is working fine';
        return next record_v.table_name ||'    '   ||record_v.column_name  || '   '||tempcount ;
        --return   record_v.table_name ||' '||record_v.column_name;
    END LOOP;
return;
END;
$BODY$ LANGUAGE plpgsql VOLATILE
COST 100;

select seing('tbsp')
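To list only the columns that actually contain the string, the count can be checked before returning; a sketch of the modified loop body inside the function above:

```
IF tempCount > 0 THEN
    RETURN NEXT record_v.table_name || '    ' || record_v.column_name || '   ' || tempCount;
END IF;
```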



Friday, October 3, 2014

ERROR: cannot change return type of existing function

Error faced
ERROR:  cannot change return type of existing function
HINT:  Use DROP FUNCTION first.



This error generally comes when you are trying to recreate a function with its RETURN type changed:

CREATE OR REPLACE FUNCTION seing(str char(50)) RETURNS TABLE (a varchar, b varchar) AS
$BODY$
DECLARE
.....


********** Error **********

ERROR: cannot change return type of existing function
SQL state: 42P13
Hint: Use DROP FUNCTION first.


Solution
Drop the function using a DROP command as below:


drop function seing(char(50));

This worked fine at my end. If you face the above error in the future, always drop the function
with a command like the one above;
note that the function name must be followed by the argument data types as defined in the previous version of the function.
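If you are not sure of the old signature, it can be looked up in the system catalog (a sketch for our function name):

```
SELECT proname, pg_get_function_identity_arguments(oid)
FROM pg_proc
WHERE proname = 'seing';
```

The second column gives exactly the argument list to paste into the DROP FUNCTION command.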

Sunday, September 28, 2014

Execute postgres SQL script on cmd

To restore a database from a .sql file, we need to execute/run the file with the help of cmd.

open postgres
    create database.
open cmd
    move to the bin folder of postgres
    cd C:\Program Files\PostgreSQL\9.1\bin

    then execute the command
    C:\Program Files\PostgreSQL\9.1\bin>psql -U postgres -d usda -a -f C:\Users\gurjeet\Desktop\usda.sql

CHANGE .KETTLE PATH to the directory itself

Open "spoon.bat" and, just below the 'Kettle home' comment block, add:

set KETTLE_HOME=.

:: **************************************************
:: ** Kettle home                                  **
:: **************************************************

if "%KETTLE_DIR%"=="" set KETTLE_DIR=%~dp0
if %KETTLE_DIR:~-1%==\ set KETTLE_DIR=%KETTLE_DIR:~0,-1%

cd %KETTLE_DIR%
set KETTLE_HOME=.
set PENTAHO_JAVA=javaw
set IS64BITJAVA=0

call "%~dp0set-pentaho-env.bat"


(The added line is set KETTLE_HOME=., shown in the block above.)

Wednesday, September 24, 2014

ORA-03113: end-of-file on communication channel



While trying to connect to the database, the connection fails with the error below.





SQL> startup
ORACLE instance started.


Total System Global Area  778387456 bytes
Fixed Size                  1374808 bytes
Variable Size             411043240 bytes
Database Buffers          360710144 bytes
Redo Buffers                5259264 bytes
Database mounted.
ORA-03113: end-of-file on communication channel
Process ID: 6284
Session ID: 9 Serial number: 5






After the error I looked into the trace file, which showed the following.





*** 2014-09-23 18:36:23.438
*** SESSION ID:(9.5) 2014-09-23 18:36:23.438
*** CLIENT ID:() 2014-09-23 18:36:23.438
*** SERVICE NAME:() 2014-09-23 18:36:23.438
*** MODULE NAME:(sqlplus.exe) 2014-09-23 18:36:23.438
*** ACTION NAME:() 2014-09-23 18:36:23.438
 
ORA-19815: WARNING: db_recovery_file_dest_size of 4039114752 bytes is 100.00% used, and has 0 remaining bytes available.


*** 2014-09-23 18:36:24.438
************************************************************************
You have following choices to free up space from recovery area:
1. Consider changing RMAN RETENTION POLICY. If you are using Data Guard,
   then consider changing RMAN ARCHIVELOG DELETION POLICY.
2. Back up files to tertiary device such as tape using RMAN
   BACKUP RECOVERY AREA command.
3. Add disk space and increase db_recovery_file_dest_size parameter to
   reflect the new space.
4. Delete unnecessary files using RMAN DELETE command. If an operating
   system command was used to delete files, then use RMAN CROSSCHECK and
   DELETE EXPIRED commands.
************************************************************************
ORA-19809: limit exceeded for recovery files
ORA-19804: cannot reclaim 41056768 bytes disk space from 4039114752 limit
*** 2014-09-23 18:36:24.448 4132 krsh.c
ARCH: Error 19809 Creating archive log file to 'E:\APP\****\FLASH_RECOVERY_AREA\ZA1\ARCHIVELOG\2014_09_23\O1_MF_1_133_%U_.ARC'
*** 2014-09-23 18:36:24.448 2747 krsi.c
krsi_dst_fail: dest:1 err:19809 force:0 blast:1
DDE: Problem Key 'ORA 312' was flood controlled (0x1) (no incident)
ORA-00312: online log 1 thread 1: 'E:\APP\****\ORADATA\ZA1\REDO01.LOG'
ORA-16038: log 1 sequence# 133 cannot be archived
ORA-19809: limit exceeded for recovery files
ORA-00312: online log 1 thread 1: 'E:\APP\****\ORADATA\ZA1\REDO01.LOG'


*** 2014-09-23 18:36:24.488
USER (ospid: 6284): terminating the instance due to error 16038






Now connect as the user
and start the database in mount state:



Startup mount;


Execute the below set of queries:
SQL> show parameter alert
NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
log_checkpoints_to_alert             boolean     FALSE


SQL> show parameter BACKGROUND_DUMP_DEST
NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
background_dump_dest                 string      e:\app\****\diag\rdbms\za1\za1\trace


SQL> show parameter db_recovery_file_dest_size
NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
db_recovery_file_dest_size           big integer 3852M



SQL>  select name,
  2     floor(space_limit/1024/1024) "Size_MB",
  3     ceil(space_used/1024/1024) "Used_MB"
  4     from v$recovery_file_dest
  5     order by name
  6     /
NAME                                        Size_MB    Used_MB
---------------------------------------- ---------- ----------
E:\app\****\flash_recovery_area            3852       3851


 

Open cmd:

1. RMAN
2. connect target /
3. crosscheck archivelog all;
4. delete archivelog all completed before 'sysdate -7';
5. RMAN prompts for Yes or No
6. type Y and press Enter
7. exit



NAME                                        Size_MB    Used_MB
---------------------------------------- ---------- ----------
E:\app\****\flash_recovery_area            3852         36



The 3rd and 4th commands above check the archive log files and delete those older than a week from the current date.
So now we have free space in the "flash_recovery_area".
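Since the instance was started in mount state earlier, the database still has to be opened before users can connect (a sketch):

```
SQL> alter database open;

Database altered.
```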


Now we can connect to the database without any hurdle.


 

Wednesday, September 17, 2014

QV16 Create calendar dimension using min max date

In this post I create a calendar dimension. From the fact table we can get the minimum and maximum date; using these dates we can create a calendar dimension that includes quarter information, year, month, day, date and more.
 
ODBC CONNECT TO [oracle_scott;DBQ=ORCL ];

Base:
LOAD EMPNO,
    ENAME,
    JOB,
    MGR,
    date(HIREDATE,'DD/MM/YYYY') as HIREDATE,
    SAL,
    COMM,
    DEPTNO;
SQL SELECT *
FROM SCOTT.EMP;


minMaxdate:
Load min(HIREDATE) as minDate, max(HIREDATE) as maxDate Resident Base;

Let vminDate = num(peek('minDate',0,'minMaxdate'));
Let vmaxDate = num(peek('maxDate',0,'minMaxdate'));

cal1:
load
    IterNo() as num1,
    $(vminDate) + IterNo() - 1 as Num,
    date($(vminDate) + IterNo()-1) as TempDate
AutoGenerate 1 While
$(vminDate)+IterNo()-1 <= $(vmaxDate);

cal2:
load
    Num as DateSeq,
    TempDate as TheDate,
    Month(TempDate) as month,
    num(Month(TempDate)) as MonthSeq,
    Year(TempDate) as yearSeq,
    day(TempDate) as DaySeq
Resident cal1
order by TempDate ASC;

//drop table cal1;

cal3:
LOAD
    DateSeq,
    TheDate as HIREDATE,
    yearSeq,
    month,
    MonthSeq,
    DaySeq,
    MonthSeq + (yearSeq - 1) *12 as MonthSeq1,
    Ceil(MonthSeq/3) as quarter,
    'Q'&     Ceil(MonthSeq/3) as quarter1,
    WeekDay(TheDate) as DayName
Resident cal2
order by TheDate ASC;



Thursday, August 28, 2014

QV15 Hierarchy In QLIKVIEW

QLIKVIEW, wow, a great tool again. It solves the big problem of handling hierarchical data.
We have two functions for this, which can be used at script level:
 - hierarchy (up to 8 parameters can be passed)
 - hierarchybelongsto (6 parameters can be passed)


Both functions are used as a prefix in front of a LOAD or SELECT statement.

hierarchy:
It converts an adjacent nodes table to an expanded nodes table; in that way each level of child record is written to a separate field.

Let's do some hands-on practice on the table scott.emp!

Create the connection and load the table.
You can copy-paste the script below directly into your script editor after adding the connection. I have three sets of queries, and we will see the behaviour of each set one by one:

//HIERARChy(EMPNO,MGR,ENAME)
//LOAD EMPNO,
// ENAME,
// JOB,
// MGR,
// SAL;
//SQL SELECT *
//FROM SCOTT.EMP;



 
:::
HIERARChy(EMPNO,MGR,ENAME)
it has three parameters:
1. parameter 1, i.e. EMPNO - the field containing the child (node) id
2. parameter 2, i.e. MGR - the reference to the parent record
3. parameter 3, i.e. ENAME - any column can be used, depending on what data we want to show in a hierarchical manner;
 here I want to see the employee name in hierarchical order.


HIERARChy(EMPNO, MGR, ENAME, asdf, ENAME, [hierarchy_goeg], ';', 'HIERARCHY DEPTH')
LOAD EMPNO,
    ENAME,
    JOB,
    MGR,
    SAL;
SQL SELECT *
FROM SCOTT.EMP;





:::
HIERARChy(EMPNO,MGR,ENAME,asdf,ENAME,[hierarchy_goeg], ';', 'HIERARCHY DEPTH')
its parameters are:
1. EMPNO - the field containing the child (node) id
2. MGR - the reference to the parent record
3. ENAME - any column can be used, depending on what data we want to show in a hierarchical manner;
 here I want to see the employee name in hierarchical order.
4. asdf - ParentName: I used a dummy name here; alternatively a field describing the parent record can be used
5. ENAME - PathSource: the field used to build the path; in QlikView the path is a string containing one value per ancestor down to the node
6. hierarchy_goeg - PathName: the name of the field that will contain the path
7. ';' - Delimiter: the character used to separate the different levels in the path
8. 'HIERARCHY DEPTH' - the name of the field that will contain the depth of the node





hierarchybelongsto:
The hierarchybelongsto prefix is used to transform an adjacent nodes table, i.e. a table where each record corresponds to a node and has a field that contains a reference to the parent node. (The table stores data as parent-child relations with the help of two fields; a parent value is repeated over the rows until all of its children have been written. See the first two fields in the image below: that is an adjacent nodes table.) The result contains one record per child-ancestor pair.
:::
HierarchyBelongsTo(EMPNO,MGR,ENAME, 'ANCESTORS_KEY','ANCESTORS_NAME', 'Depth')
its parameters are:
1. EMPNO - the field containing the child (node) id
2. MGR - the reference to the parent record
3. ENAME - the node name field; here I want to see the employee name in hierarchical order
4. ANCESTORS_KEY - the field that will hold the ancestor (parent) key
5. ANCESTORS_NAME - the field that will hold the ancestor name
6. Depth - the number of levels between the node and the ancestor (0 for the node itself, increasing by 1 for each level up towards the roots)


geo2:
HierarchyBelongsTo(EMPNO, MGR, ENAME, 'ANCESTORS_KEY', 'ANCESTORS_NAME', 'Depth')
LOAD EMPNO,
    ENAME,
    JOB,
    MGR,
    SAL;
SQL SELECT *
FROM SCOTT.EMP;





 

Thursday, August 21, 2014

QV14 Peek() used in for loop

The Peek() function is also very usable inside a for loop in QlikView.
Let's have a look at a little example; I googled this example and ran it on my local machine.

In the example we have a table "FileListTable:" with two fields, Date1 and FilNme. FilNme, i.e. the filename; all the files exist on the local hard disk. We load each file and can then perform any kind of string operation on the loaded file.
In the for loop:
1. NoOfRows() returns the total number of rows in the table.
2. Peek() uses a zero-based row index, hence the "vFileNo-1": loop counter 1 maps to row 0 (the first file) and loop counter 11 maps to row 10 (the last file), e.g.
   11-1 = 10 -> Airline Operations_ch8.qvw.2014_07_28_12_18_30.log
   10-1 = 9  -> Airline Operations_ch8.qvw.2014_07_28_11_34_41.log
   9-1 = 8   -> Airline Operations_ch8.qvw.2014_07_25_16_51_02.log
   ....
   2-1 = 1   -> Airline Operations_ch8.qvw.2014_07_25_15_38_06.log
   1-1 = 0   -> Airline Operations_ch8.qvw.2014_07_25_15_36_35.log

3. In the loop we create a new file name, which saves the log file in the form of a text file.
4. All log files are stored as "*.txt".







FileListTable:
Load * inline
[
Date1, FilNme
18-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_36_35.log
25-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_38_06.log,
26-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_39_08.log,
27-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_50_00.log,
19-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_50_37.log,
20-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_53_15.log,
21-07-2014, Airline Operations_ch8.qvw.2014_07_25_15_57_28.log,
22-07-2014, Airline Operations_ch8.qvw.2014_07_25_16_48_30.log,
23-07-2014, Airline Operations_ch8.qvw.2014_07_25_16_51_02.log,
24-07-2014, Airline Operations_ch8.qvw.2014_07_28_11_34_41.log,
28-07-2014, Airline Operations_ch8.qvw.2014_07_28_12_18_30.log
]



For vFileNo = 1 to NoOfRows('FileListTable')
      Let vFilNme = Peek('FilNme',vFileNo-1,'FileListTable');
       xyz: Load *,'$(vFilNme)' as FilNme
       From
[$(vFilNme)];      
       let vfilenme =replace(vFilNme,'.log','.txt') ;
       store * from xyz into $(vfilenme);
Next vFileNo
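If each .txt file should contain only its own log, note that the table xyz otherwise auto-concatenates across iterations, so later files accumulate earlier rows. Dropping the temporary table inside the loop avoids that - a sketch that changes the original behaviour:

```
For vFileNo = 1 to NoOfRows('FileListTable')
    Let vFilNme = Peek('FilNme', vFileNo-1, 'FileListTable');
    xyz:
    Load *, '$(vFilNme)' as FilNme
    From [$(vFilNme)];
    Let vfilenme = replace(vFilNme, '.log', '.txt');
    Store xyz into [$(vfilenme)] (txt);
    Drop Table xyz;    // prevents auto-concatenation into the next iteration
Next vFileNo
```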

 

QV13 Peek() and previous()

The Peek() function allows the user to look into a field that was not previously loaded in the script,
whereas the
Previous() function can only look into a previously loaded field.


Both are inter-record functions. These functions are used when a value from previously loaded records of data is needed for the evaluation of the current record.

peek(fieldname [ , row [ , tablename ] ] )
Returns the contents of fieldname in the record specified by row in the internal table tablename. Data is fetched from the associative QlikView database.
Fieldname must be given as a string (in single quotes).

Row must be an integer.
  • 0 denotes the first record,
  • 1 the second and so on.
  • Negative numbers indicate order from the end of the table. -1 denotes the last record read.
  • If no row is stated, -1 is assumed.
     
If no tablename is stated, the current table is assumed.


previous(expression )
Returns the value of expression using data from the previous input record. For the first record of an internal table the function returns NULL, because no previous record exists.

The previous function may be nested.
Data is fetched directly from the input source, making it possible to refer to fields that have not been loaded into QlikView, i.e. even if they have not been stored in its associative database.

Some similarities and differences

The similarities:
 - Both allow you to look back at previously loaded rows in a table.
 - Both can be manipulated to look at not only the last row loaded but also earlier rows.

The differences:
 - Previous() operates on the input of the LOAD statement, whereas Peek() operates on the output of the LOAD statement (the same as the difference between RecNo() and RowNo()). This means the two functions behave differently if you have a Where clause.
 - Peek() can easily reference any previously loaded row in the table using the row number in the function, e.g. Peek('Employee Count', 0) reads the first row. Using a minus sign references from the last row up, e.g. Peek('Employee Count', -1) reads the last row; if no row is specified, the last row (-1) is assumed. Previous() needs to be nested in order to reference any row other than the previous one, e.g. Previous(Previous(Hires)) looks at the row loaded two rows before the current row.

So, when is it best to use each function? Previous() and Peek() can both be used when you need to show the current value versus the previous value of a field loaded from the original file.
    Peek() is better suited when you are targeting either a field that has not been previously loaded into the table, or a specific row.

source http://community.qlik.com/blogs/qlikviewdesignblog/2013/04/08/peek-vs-previous-when-to-use-each

//example1

DR:
LOAD F1, F2, Peek('F1') as PeekVal, Peek('F1',1,'DR') as PeekVal1, Previous(F2) as PrevVal ;
// Where F2 >= 200;
LOAD * INLINE
[
F1, F2
A, 100
B, 200
C, 150
D, 320
E, 222
F, 903
G, 666
]
;



//example2

Data in the file "9_Peek vs Previous.xlsx":
Date Hired Terminated
1/1/2011 6 0
2/1/2011 4 2
3/1/2011 6 1
4/1/2011 5 2
5/1/2011 3 2
6/1/2011 4 1
7/1/2011 6 2
8/1/2011 4 1
9/1/2011 4 0
10/1/2011 1 0
11/1/2011 1 0
12/1/2011 3 2
1/1/2012 1 1
2/1/2012 1 0
3/1/2012 3 1
4/1/2012 3 4
5/1/2012 2 1
6/1/2012 1 0
7/1/2012 1 0
8/1/2012 3 0
9/1/2012 4 3
10/1/2012 0 3
11/1/2012 2 1
12/1/2012 0 0
1/1/2013 4 2
2/1/2013 2 2
3/1/2013 3 0


[Employees Init]:
LOAD
    rowno() as Row,
    Date(Date) as Date,
    Hired,
    Terminated,
    If(rowno()=1, Hired-Terminated,
       Peek('Employee Count', -1) + (Hired-Terminated)) as [Employee Count]
From
[..\datafiles\9_Peek vs Previous.xlsx]
(ooxml, embedded labels, table is Sheet2);

[Employee Count]:
LOAD
    Row,
    Date,
    Hired,
    Terminated,
    [Employee Count],
    If(rowno()=1, 0, [Employee Count] - Previous([Employee Count])) as [Employee Var]
Resident [Employees Init] Order By Row asc;

Drop Table [Employees Init];




Image explanation:
1. Hired - number of employees hired
2. Terminated - number of employees terminated
3. [Employee Count] - accumulation (number of working employees minus terminated employees);
we used the Peek() function to get the desired result
4. [Employee Var] - employee variation derived from the field [Employee Count] using Previous()

// Note that we use Previous() in a different load statement, because Previous() works on previously loaded fields.
// If we tried to use Previous() inside the "[Employees Init]" load it would not work, because "[Employee Count]" is still being calculated in that very statement.

Wednesday, August 20, 2014

QV12 Above() Below() function in qlikview

Above() and Below() belong to the chart inter-record functions category. These functions can only be used in chart expressions.

SYNTAX is:
above([ total ] expression [ , offset [,n ]])

It returns the value of expression evaluated on the row above the current row in a table chart.

Where can we use the Above() function?
- when we need accumulation in a pivot table
- when we need differences or running sums of sales (by month, year or any other perspective), i.e. an accumulative sum
Below is an example with some sample data; we will see the behaviour of the Above() function in a straight table as well as in a pivot table.

ex:
LOAD * INLINE [
sno, no
1,20
2,578
3,344
4,54
5,34
6,34
7,34
8,34
]
;

In chart properties add the below expressions:
- sum(no)
- above(no) ~~ above(no,1)
- below(no,1)
- sum(aggr(rangesum(above(total no)),sno))

Reload the document and see the behaviour of table chart.


image



Explanation:
1. sum(no) - gives the total of the expression.
2. above(no) ~~ above(no,1)
As you can see, I have written above(no) and above(no,1) as two different entries, but they work the same: if we skip the offset, QlikView assumes 1 by default.
And as the image shows, the column starts showing values from the 2nd row, shifted by one, because the offset is one. If we replace it with two, it starts writing from the 3rd row.

- And you can see the first row holds null, because there is no value above 20.
3. below(no,1) - writes into the column the value from the row below the current row.

4. As the image also shows, the above(no) and below(no) columns do not sum up in the total row.
To get a sum for such a column we need "rangesum()" together with the "aggr()" function.



Above() behaviour with a pivot table:
Load the inline table below in the script:

abve:
LOAD * INLINE
[
Customer, FK, Sales
A, 1, 50
A, 2, 45
A, 3, 25
A, 4, 75
B, 1, 35
B, 2, 75
B, 3, 12
B, 4, 14
C, 1, 12
C, 2, 50
C, 3, 45
C, 4, 65
D, 5, 45
D, 6, 76
D, 7, 12
E, 1, 12
E, 2, 15
F, 1, 25
F, 2, 56
F, 3, 87
F, 4, 98
F, 5, 54
G, 1, 12
G, 2, 34
G, 3, 15
H, 1, 15
H, 2, 65
H, 3, 15
H, 4, 12
]
;


LOAD * INLINE
[
TestVal
1
2
5
6
7
]
;

Dimensions: Customer, FK
Expressions:
- Sum (Sales)
pivot_column - =above( column(1))
total_sum - =Above(TOTAL Column(1)) + If(Above(TOTAL Column(3)) > 0, Above(TOTAL Column(3)), 0)

sum customer wise - rangesum(ABOVE(column(1), 1, ROWNO()))

reload and create a pivot table chart.




Tuesday, August 19, 2014

QV11 Only() function in qlikview

The Only() function returns the value of its parameter when there is exactly one possible value within the scope of the dimension; otherwise it returns NULL. It always returns a single value.

pricem:
Load * inline [
Month, Price
1, 345
1, 342
1, 34
1, 345
1, 67
2, 76
2, 87
2, 67
2, 33
3, 321
3, 123
3, 324
3, 274
4, 33
4, 49
4, 192
5, 923
5, 347
5, 43
5, 84
5, 52
5, 55
6, 10
6, 12
6, 554
6, 546
7, 22
7, 10
7, 20
8, 30
8, 40
9, 50
10, 6
10, 60
10, 70
11, 80
12, 90
12, 90
]
;


load Month, Only(Price) Resident pricem group by Month
;


The above query returns a value only for months 9, 11 and 12 - the only months with exactly one distinct price.


This is where the Only() function affects a calculation without the user knowing it: if there is no explicit aggregation function in an expression, QlikView uses the Only() function implicitly. Hence, in some cases:

- "Only(Date)" is used as a sort expression
- "=Only(Product)='Shoe'" is used as a search criterion.

case will be updated soon

QV10 Cross table

What are cross tables?
A cross table is like any 2-D sheet - an Excel sheet or a feedback form - where the facts are spread out over the columns on the basis of a dimension.

Good examples of this:
- a sales record in an Excel sheet where the months are split over the columns
- a feedback form arranged month-wise in a file (e.g. quality feedback over various categories)

Dire:
LOAD Salesman,
    Year,
    Jan,
    Feb,
    Mar,
    Apr,
    May,
    Jun
FROM
[..\datafiles\7_cross_table.xls]
(biff, embedded labels, table is Sheet4$);

 
(example of cross table)

Why do we use CrossTable?
If you face a table like the one above in QlikView and try to create an annual report with Year acting as a dimension, the calculated data is split over the columns and cannot easily be summed with respect to the dimension.

So to create a meaningful report we need to denormalize the table (i.e. move the columns into rows); the result is what QlikView calls a cross table.





(We need to use the "crosstable" prefix to change the table orientation.)
 
 
Dire:
CrossTable(category, Data,2)
LOAD Salesman,
Year,
Jan,
Feb,
Mar,
Apr,
May,
Jun
FROM
[..\datafiles\7_cross_table.xls]
(biff, embedded labels, table is Sheet4$);

 *where category is the field that will hold the former column names (Jan..Jun), Data is the field that will hold the values, and 2 is the number of qualifier fields (Salesman, Year) kept as regular fields

Below is a chart created from the cross table "Dire".




Friday, August 8, 2014

QV9 Dynamic Update

Again a great functionality in QlikView: Dynamic Update. Through this we can write back to QlikView's data set, and the changes are reflected in the QlikView data model. Later these changes can also be written to a QVD, XLS or any other source file.
Below is my report UI.


sub Insert
 set Result = ActiveDocument.DynamicUpdateCommand("INSERT INTO TEST  (SRNO,TYPE) VALUES (4,A)")
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub Update
 SQL_String="UPDATE TEST SET SR = 2 WHERE SRNO=3;"
  SET Result = ActiveDocument.DynamicUpdateCommand (SQL_String)
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub Delete
 SQL_String="Delete from TEST where SRNO =4;"
  SET Result = ActiveDocument.DynamicUpdateCommand (SQL_String)
  if Result = false then
    MsgBox Result.ErrorMessage
  end if 
end sub

sub SaveTableToFile
set obj = ActiveDocument.GetSheetObject("CH01")
obj.ExportEx "..\datafiles\6_dynamic_update.qvd", 4
end sub


INSERT INTO TEST (SRNO,TYPE) VALUES ($(vCnt), A $(vCnt))
UPDATE TEST SET CDATE = TODAY() WHERE SRNO=$(vCnt);
DELETE FROM TEST WHERE SRNO  = $(vCnt)

-- script editor code: copy and paste into the tab below
// execute only once, then comment it out
//LOAD SRNO,
//    CDATE,
//    TYPE,SR;
//SQL SELECT *
//FROM SCOTT.TEST;
//
//
//store TEST into ..\datafiles\6_dynamic_update.qvd;

TEST:
LOAD SRNO,
     CDATE,
     TYPE,
     SR
FROM
[..\datafiles\6_dynamic_update.qvd]
(qvd);

set vCnt = 1;
 

Thursday, August 7, 2014

QV8 Partial Reload

Partial reload - as the name specifies, we do a partial load instead of a complete reload. Sometimes we make changes to the database, in the form of:
- a new table created
- new attributes added to a table
Now we want that table and the new fields to be reflected in the QlikView data model. So generally we reload the document.
But that is not a good deal if the reload takes hours to complete; it wastes all that time.
So, to save time, QlikView provides the partial reload: instead of a full reload we load only those tables that were modified in the database. This approach is called partial reload.
Advantage:
- During a partial reload all other load statements and scripts remain untouched.
 A partial reload executes only those commands where it finds ADD or REPLACE as a prefix to the LOAD statement.

How can a partial reload be initialized?
- Ctrl+Shift+R
- File > Partial Reload

There are two approaches:
- Use "ADD" or "REPLACE" in front of the LOAD statement (both a partial reload and a full reload will execute such a statement).
- Use "ADD ONLY" or "REPLACE ONLY" instead of "ADD" or "REPLACE".
 With "ONLY", the statement executes only when we do a partial reload of the document; a full reload will not touch these LOAD statements.

- We can also use the IsPartialReload() function:
 it returns -1 when we do a partial reload (Ctrl+Shift+R)
 and 0 when we do a full reload (Ctrl+R).

  
-- With the IsPartialReload() function: put the code below in the script editor
if (IsPartialReload() = -1 ) then
add LOAD GRADE,
    LOSAL,
    HISAL;
SQL SELECT *
FROM SCOTT.SALGRADE;
end if;


The code above will execute only if we do a partial reload.


-- With the ADD and REPLACE keywords
// The commented statement below was loaded once with a full reload (Ctrl+R); it loaded 3 columns into the data model.
//dasf:
//replace
//LOAD SRNO,
//    CDATE,
//   TYPE;
//   SQL SELECT *
//FROM SCOTT."TEST";


dasf:
replace only
LOAD SRNO,
     CDATE,
     TYPE,
     SR;
SQL SELECT *
FROM SCOTT."TEST";


fligh:
add only
LOAD "PRIMARY_KEY",
    "CREATED_DATE",
    "UPDATED_DATE",
    YEAR,
    MONTH,
    "AIRCRAFT_TYPE_ID",
    "DEPARTURES_SCHEDULED",
    "DEPARTURES_PERFORMED",
    "AVAILABLE_SEATS",
    "TRANSPORTED_PASSENGERS",
    "TRANSPORTED_FREIGHT";
SQL SELECT *
FROM SCOTT."FLIGHT_DATA_1";


- Now save the document and do a partial reload (Ctrl+Shift+R).

It is suggested to check the data model after every reload.

QV7 Generic load in qlikview

A generic table is commonly used to store attribute values for different objects; a ball and a box make a good example.

These attributes are not necessarily shared across all objects; objects may or may not have fields in common.
Say we have two objects, a ball and a box:
ball attributes - color, weight, diameter
box attributes - color, weight, length, height, width

Here we can see that color and weight are fields common to both objects. The attributes and their values are stored in a single table, as below.




Enter the Generic prefix.
It converts the data to a structure where each attribute is placed in a field of its own (each attribute now acts as an entity of its own). Another way to express it is to say that it takes field values and converts them to field names. If you compare it to the Crosstable prefix, you will find that they are, in principle, each other's inverses.


The syntax is
   Generic Load Key, Attribute, Value From … ;


There are, however, a couple of things worth noting:
- Usually the input data has three columns: one qualifier field (Key in the example above), an Attribute, and a Value. But you may also have several qualifying fields. If there are four or more columns, all columns except the last two are treated as qualifying fields.
- The Generic prefix will create several tables: one table per attribute. This is normally not a problem. Rather, it is an advantage: it is the least memory-consuming way to store data if you have many attributes.


Now open QlikView, create a new document, save it, and open the script editor.
// generic load with one consolidated resulting table:
InputTable:
LOAD * INLINE [
object,attribute,value
ball,color,red
ball,diameter,10 cm
ball,weight,100 g
box,color,black
box,height,16 cm
box,length,20 cm
box,weight,500 g
box,width,10 cm
];
 

GenTable:
Generic Load object, attribute, value Resident InputTable;
 



 
After the reload, check the data model.


Now, as you can see in the image, every attribute acts as an entity of its own.

After searching a lot on Google, I have found that the Generic load is used quite rarely. It is mainly useful when we want to change the orientation of a table.





Paste the code below into the same script editor to consolidate the generated tables back into a single table.
ResultTable:
LOAD Distinct object Resident InputTable;
 

FOR i = 0 to NoOfTables()
 TableList:
 LOAD
  TableName($(i)) as Tablename AUTOGENERATE 1
 WHERE
  WildMatch(TableName($(i)), 'GenTable.*');
NEXT i


FOR i = 1 to FieldValueCount('Tablename')
 LET vTable = FieldValue('Tablename', $(i));
 LEFT JOIN (ResultTable) LOAD * RESIDENT [$(vTable)];
 DROP TABLE [$(vTable)];
NEXT i

source -
http://community.qlik.com/blogs/qlikviewdesignblog/2014/03/31/generic