As the year comes to an end, I wanted to take a moment to thank you all for keeping this blog going.
When I started this, I didn't think it would last more than a few months, or that I would even have enough material for more than ten articles. It is your continued interest and insights that have kept this going.
Wishing you all a happy and prosperous New Year. Happy Holidays!
Cheers,
Zephyr
Friday, December 24, 2010
Thursday, December 16, 2010
Calculations in Dimensional Query Subjects
A long time back, while working on a relational ad-hoc package request, I ran into an issue. I have been unable to explain its cause, and the solution involved a work-around that I am unhappy with. I am posting it here hoping that someone might offer a better solution, or even validate my reasoning on the issue.
Let us assume the ad-hoc package involves the query subjects Product, Revenue, and Forecast. Product behaves like a dimension because it sits on the 1 side of a 1-N relationship to the other two query subjects. I created a calculation, say Calc A, in the Product query subject. The package also has a stand-alone calculation, say Calc B, defined as Revenue / Calc A.
In Query Studio, when you drag in Product and Calc B, the data is off. The SQL generated by Cognos applies a min to Calc A even when its aggregate is set to Total or Sum. Wrapping the Total function inside Calc A doesn't help either. No determinants are involved, as the fact tables are at the same granularity and join to the dimensions at the same granularity.
My assumption on this issue now is that Cognos treats Product as a dimension and hence applies a min to the dimensional data item.
My work-around was to create a shortcut to Product and join it to Product with a cardinality that makes it behave like a fact. This has the downside that the table is queried twice, and a stitched query is generated when Calc B is pulled into the query. Since the dimension table in my case was not huge, there wasn't much performance impact.
Otherwise I would have to manipulate cardinalities to make Cognos treat Product as a fact, which is not advisable, as it impacts other queries and still would not remove the stitched query. Any thoughts on this?
Tuesday, December 7, 2010
Column Pagination
This tip is probably something everyone is aware of. When dealing with wide reports, you can split the report horizontally across pages, and you can control which columns appear on each of those horizontal pages using the Pagination property of the List Column.
Moving Aggregate Calculations on Dimensional Model
Here's a requirement to use moving-aggregate calculations against a dimensional model, along with the solution. The requirement is based on dates and hence may not be applicable to all scenarios.
Requirement: Display Quantity Sold, Revenue, and Margin (calculated as Revenue / Quantity) for each year in a crosstab. Also show the Margin Differential, a moving-difference calculation: (current year quantity - previous year quantity) / previous year's margin.
Solution:
Step 1: Create a crosstab report with Years, Quantity Sold, Revenue dragged into it.
Step 2: Create a calculation to get the previous year: lag(currentMember([Great Outdoors].[Years].[Years]),1). Let us call this Prev Year.
Step 3: Create a calculation called Qty Diff: [Quantity Sold] - aggregate([Quantity Sold] within set [Prev Year])
Step 4: Create the Margin Diff calculation as [Qty Diff] / aggregate([Margin] within set [Prev Year]).
User comment [Paul]: You could use the periodsToDate function for a running aggregate: aggregate([Quantity Sold] within set periodsToDate([All Member],currentMember([Great Outdoors].[Years].[Years])))
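For reference, the arithmetic behind the moving-difference calculation can be sketched in plain Python (the years and figures below are made up, not taken from the Great Outdoors cube):

```python
# Each row: (year, quantity sold, revenue) -- illustrative numbers only.
rows = [
    (2008, 1000, 50000.0),
    (2009, 1200, 66000.0),
    (2010, 1500, 90000.0),
]

def margin(qty, rev):
    return rev / qty  # Margin = Revenue / Quantity

report = []
prev = None
for year, qty, rev in rows:
    if prev is None:
        margin_diff = None  # no previous year to lag to
    else:
        p_year, p_qty, p_rev = prev
        # Margin Diff = (current qty - previous qty) / previous year's margin
        margin_diff = (qty - p_qty) / margin(p_qty, p_rev)
    report.append((year, qty, rev, margin(qty, rev), margin_diff))
    prev = (year, qty, rev)

for r in report:
    print(r)
```

The lag/aggregate pair in the MDX steps plays the role of the `prev` variable here: it shifts each year's quantity and margin back by one member before the difference is taken.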
Wednesday, December 1, 2010
Creating Effective Dashboards
Here's a look at how to create effective dashboards using the Portal feature of Cognos.
Requirement: Create a dashboard showing Product Name, Sales in a list and chart report as tabbed reports with a prompt on Product ID.
Solution:
Step 1: Create folders - Dashboard Reports, Dashboard Portals.
Step 2: Create the prompt report under the Dashboard Reports folder. Name it Dashboard - Prompt. Create an optional Product value prompt.
Step 3: Create a list report and a chart report under the Dashboard Reports folder. Name them Dashboard - List and Dashboard - Chart. Include optional Product filters in both reports.
Step 4: Create a portal page for each of the reports under the Dashboard Portals folder. Name these Dashboard - List Portal Page and Dashboard - Chart Portal Page. Set the content to Cognos Viewer.
Step 5: Edit the portal pages and set the Entry to the corresponding Dashboard Reports. Hide the title bars in the portal pages.
Step 6: Set the Portlet Communication options under Report Properties for each of the dashboard portal pages. Provide the channel name for this property. All portlets sharing the same channel name can then communicate with each other and pass parameter values between them.
Step 7: Create the main portal page (Dashboard) that integrates the prompt report and the report portal pages. Set the content to use Cognos Viewer for the prompt section and a multi-page object for the tabbed report output.
Step 8: For the prompt report section, set the Portlet Communication options under Report Properties.
Tuesday, November 23, 2010
Auto Correct option in Report Studio
I came across this option in the Tools menu of Report Studio and found it really useful.
When you run into validation errors while working on complex reports that involve a lot of data items and query objects, identifying the data item causing the error becomes difficult, especially if complex syntax is used across the data items. In such cases, click the Auto Correct option under the Tools menu to quickly identify all data items that have errors. This is much easier than going through the validation errors and trying to figure out which data items have issues. But do not click the OK button of the dialog box, as this would remove the erroneous data items from the report.
Let's say we have a report that displays Product Line, Revenue, Margin, and Quantity. The requirement is to display the values only if Quantity is more than 10000; otherwise the records should display NA. So we include a case statement in all the metrics: "Case when [Sales(query)].[Sales].[Quantity] > 10000 then..". Now let's assume we made a mistake while copy-pasting this expression across the data items. Validating the report would throw the following error:
From the error, it's difficult to make out which data item caused it. Let's select the Auto Correct option from the Tools menu:
Though this feature is useful, it has limitations: the dialog box only displays the data item name, so if there are multiple data items with the same name across query objects, it's difficult to identify the query that has the error.
Monday, November 15, 2010
Parameterized URLs for Retrieving Burst Outputs
Many of you must be aware of using URLs for inserting Cognos content in external applications. How do you frame the URL for retrieving bursted report outputs?
To retrieve the burst output of the report view TEST REPORT, stored in the folder TEST under Public Folders, with burst key 123:
http://<cognos environment>?b_action=cognosViewer&ui.action=view&ui.object=/content/folder[@name='TEST']/reportView[@name='TEST REPORT']/reportVersion[starts-with(@name,'20')]/output[contains(@burstKey,'123')]
Note that the above URL will look through all burst outputs generated in any year starting with "20", i.e. 2000 to 2999.
You can use starts-with or contains for the reportVersion and output options.
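If you are assembling such URLs programmatically, the Content Manager search path should be URL-encoded before it goes into the `ui.object` parameter. A sketch (the gateway URI below is a placeholder for your environment):

```python
from urllib.parse import quote

# Hypothetical gateway URI -- substitute your own Cognos environment.
gateway = "http://cognosserver/ibmcognos/cgi-bin/cognos.cgi"

# The search path from the example above, built up piece by piece.
search_path = (
    "/content/folder[@name='TEST']"
    "/reportView[@name='TEST REPORT']"
    "/reportVersion[starts-with(@name,'20')]"
    "/output[contains(@burstKey,'123')]"
)

# quote() percent-encodes the brackets, quotes, and @ signs so the
# path survives as a single query-string parameter value.
url = (
    f"{gateway}?b_action=cognosViewer"
    f"&ui.action=view"
    f"&ui.object={quote(search_path)}"
)
print(url)
```

The unencoded form shown above also works in most browsers, but encoding avoids surprises when the URL is embedded in another application's HTML or config.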
Example: We have a report Burst PoC with Burst outputs.
The URL to retrieve the burst output for Americas would be:
Wednesday, November 10, 2010
Teradata Unions - Numeric Overflow Error
Requirement: There are 2 tables, Sales and Revenue, with the columns Date, Product, Sales and Date, Product, Revenue respectively. Union the data from these 2 tables and display the Sales and Revenue values.
Create 2 query objects with the data items Date, Product, Sales, and Revenue, with Sales set to 0 in query 1 and Revenue set to 0 in query 2. You may run into numeric overflow errors with Teradata because Sales is set to the literal 0 in query 1, so Teradata considers the data type of Sales to be integer rather than decimal, causing the error. Cast the 0s to the appropriate data types and the issue is resolved.
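In raw SQL terms, the fix looks like the sketch below (the table and column names are illustrative, and the DECIMAL precision is an assumption; the essential change is the CAST on the literal zeros so the unioned columns are typed as decimal in both branches):

```python
# The corrected union, held as a string for illustration.
# Without the CASTs, Teradata types the 0 literal as INTEGER and the
# unioned Sales/Revenue columns can overflow at aggregation time.
query = """
SELECT SaleDate, Product, Sales, CAST(0 AS DECIMAL(18,2)) AS Revenue
FROM SalesTable
UNION ALL
SELECT SaleDate, Product, CAST(0 AS DECIMAL(18,2)) AS Sales, Revenue
FROM RevenueTable
"""
print(query)
```

In the Cognos query objects themselves, the equivalent is casting the 0 data item to the matching decimal type rather than leaving it as a bare literal.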
Wednesday, November 3, 2010
Avoiding Meta data Callbacks due to Hard-Coded SQLs
You have probably noticed that Cognos creates metadata call-backs when you have reports based on query subjects that have SQL hard-coded in them. To avoid such call-backs, import all the tables referenced by your query and leave them untouched. Cognos will reference the metadata from these tables while generating the SQL hard-coded in the query subject.
I would suggest avoiding hard-coded SQL in query subjects altogether; instead, create the queries as views in the DB and reference them through Cognos. This ensures better maintenance of objects and easier impact analysis of DB changes. Hard-coded SQL in query subjects should only be used when dynamic behavior is required through the use of Cognos macros in the queries.
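A minimal illustration of the view-based approach, using an in-memory SQLite database as a stand-in for the warehouse (the table, view, and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("A", 100.0), ("A", 50.0), ("B", 30.0)],
)

# The logic that would otherwise be hard-coded inside a query subject
# lives in the database as a view instead.
conn.execute("""
    CREATE VIEW product_revenue AS
    SELECT product, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY product
""")

# Cognos (or any client) then imports and queries the view like a table,
# so metadata comes from the catalog and impact analysis stays in the DB.
rows = conn.execute(
    "SELECT product, total_revenue FROM product_revenue ORDER BY product"
).fetchall()
print(rows)
```

The same shape applies to the warehouse DB: the view is the single place to change when the underlying tables move.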
Thursday, October 14, 2010
Macro Prompts in FM Model for Adhoc Packages
I have been working on creating an ad-hoc package for use in Query Studio. The package needs to include base metrics and compound metrics, and users need to be prompted for a date range and other dimensions when pulling any of the metrics.
For the prompts, I included filters in the model query subject that the metrics are built off. What I noticed was that only the base metrics prompt the users; if you drag a compound metric, you are not prompted the first time in Query Studio until you hit the re-run button.
The work-around is to use macro filters rather than ?parameter? filters.
Tuesday, October 5, 2010
FM and Teradata Stored Procedures
I have been working on using Stored Procedures to update tables through Event Studio based on a condition. The back end DB is Teradata.
With Teradata, if you have a separate schema for creating stored procedures, the schema will not show up in FM's Run Metadata Wizard unless it contains at least 1 table or view. To overcome this, a dummy table was created in the schema, after which the schema showed up in FM and I imported the SP. The SP is a data-modification SP. Teradata SPs cannot be run through FM or through Event Studio, because Cognos fires a "CALL SPName" statement rather than an "EXEC SPName" statement.
To work around this, create macros and use them through FM or Event Studio. This applies only to Teradata; Oracle stored procedures do not have any issues.
Tuesday, September 28, 2010
Creating a "Top 10 + Others" dimensional report
Couldn't figure out a better title :-( .
The requirement is to display Top x Products and group all the other products into the "Others" group. If you were using a relational model, you could rank the Products based on Revenue and then use a case statement to identify the Products that need to go into the Others group. But with a dimensional model you would run into issues using a case statement or Rank function.
Steps:
Step 1: Create a Top Prod data item to order the Products based on Revenue and to retrieve the Top 2 Products.
head(order([Sales and Marketing].[Products].[Products].[Product line],[Revenue],desc),2)
Step 2: Create a Bottom Prod data item to retrieve all the other Products.
except([Sales and Marketing].[Products].[Products].[Product line],[Top Prod])
Step 3: Drag the Revenue data item. Create a Bottom Value data item to calculate the Revenue for "Others" category.
total(currentMeasure within set [Bottom Prod])
Step 4: Create a member for Others that has the Bottom value associated with the Products Dimension.
member([Bottom Value],'Others','Others',[Sales and Marketing].[Products].[Products])
Step 5: Create a Products data item that now includes Top 2 Products and Others.
union([Top Prod],[Others])
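The five steps can be sketched in plain Python (the product lines and revenue figures below are illustrative, not from the GO Sales sample):

```python
# Illustrative per-product revenue totals.
revenue_by_product = {
    "Camping Equipment": 500.0,
    "Golf Equipment": 230.0,
    "Mountaineering Equipment": 410.0,
    "Outdoor Protection": 75.0,
}

TOP_N = 2
# Step 1: order products by revenue descending and take the head.
ordered = sorted(revenue_by_product, key=revenue_by_product.get, reverse=True)
top_prod = ordered[:TOP_N]
# Step 2: everything else -- the except() of the full set and the top set.
bottom_prod = [p for p in ordered if p not in top_prod]
# Step 3: total revenue within the bottom set.
bottom_value = sum(revenue_by_product[p] for p in bottom_prod)
# Steps 4-5: union the top members with the synthetic "Others" member.
result = [(p, revenue_by_product[p]) for p in top_prod] + [("Others", bottom_value)]
print(result)
```

The head/order, except, total-within-set, and member/union steps in the report map one-to-one onto the slicing, list comprehension, sum, and concatenation here.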
Report XML 10.1.1 against Go Sales:
<report xmlns="http://developer.cognos.com/schemas/report/8.0/" useStyleVersion="10" expressionLocale="en-us"> <modelPath>/content/folder[@name='Samples']/folder[@name='Models']/package[@name='GO Sales (analysis)']/model[@name='model']</modelPath> <drillBehavior modelBasedDrillThru="true"/> <queries> <query name="Query1"> <source> <model/> </source> <selection><dataItem name="Top Prods"><expression>head(order([Sales (analysis)].[Products].[Products].[Product line],[Revenue],desc),2)</expression></dataItem><dataItem name="Bottom Prod"><expression>except([Sales (analysis)].[Products].[Products].[Product line],[Top Prods])</expression></dataItem><dataItem name="Bottom Prod value"><expression>total(currentMeasure within set [Bottom Prod]) </expression></dataItem><dataItem name="Others"><expression>member([Bottom Prod value],'Others','Others',[Sales (analysis)].[Products].[Products]) </expression></dataItem><dataItem name="Products"><expression>union([Top Prods],[Others])</expression></dataItem><dataItemMeasure name="Revenue"><dmMember><MUN>[Sales (analysis)].[Sales].[Revenue]</MUN><itemCaption>Revenue</itemCaption></dmMember><dmDimension><DUN>[Sales (analysis)].[Sales]</DUN><itemCaption>Sales</itemCaption></dmDimension><XMLAttributes><XMLAttribute name="RS_dataType" value="2" output="no"/></XMLAttributes></dataItemMeasure></selection> </query> </queries> <layouts> <layout> <reportPages> <page name="Page1"><style><defaultStyles><defaultStyle refStyle="pg"/></defaultStyles></style> <pageBody><style><defaultStyles><defaultStyle refStyle="pb"/></defaultStyles></style> <contents> <crosstab refQuery="Query1" horizontalPagination="true" name="Crosstab1"> <noDataHandler> <contents> <block> <contents> <textItem> <dataSource> <staticValue>No Data Available</staticValue> </dataSource> <style> <CSS value="padding:10px 18px;"/> </style> </textItem> </contents> </block> </contents> </noDataHandler> <style> <defaultStyles> <defaultStyle refStyle="xt"/> </defaultStyles> <CSS 
value="border-collapse:collapse"/> </style> <crosstabFactCell><contents><textItem><dataSource><cellValue/></dataSource></textItem></contents><style><defaultStyles><defaultStyle refStyle="mv"/></defaultStyles></style></crosstabFactCell><crosstabRows><crosstabNode><crosstabNodeMembers><crosstabNodeMember refDataItem="Products" edgeLocation="e1"><style><defaultStyles><defaultStyle refStyle="ml"/></defaultStyles></style><contents><textItem><dataSource><memberCaption/></dataSource></textItem></contents></crosstabNodeMember></crosstabNodeMembers></crosstabNode></crosstabRows><crosstabColumns><crosstabNode><crosstabNodeMembers><crosstabNodeMember refDataItem="Revenue" edgeLocation="e2"><style><defaultStyles><defaultStyle refStyle="ml"/></defaultStyles></style><contents><textItem><dataSource><memberCaption/></dataSource></textItem></contents></crosstabNodeMember></crosstabNodeMembers></crosstabNode></crosstabColumns></crosstab> </contents> </pageBody> <pageHeader> <contents> <block><style><defaultStyles><defaultStyle refStyle="ta"/></defaultStyles></style> <contents> <textItem><style><defaultStyles><defaultStyle refStyle="tt"/></defaultStyles></style> <dataSource> <staticValue/> </dataSource> </textItem> </contents> </block> </contents> <style> <defaultStyles> <defaultStyle refStyle="ph"/> </defaultStyles> <CSS value="padding-bottom:10px"/> </style> </pageHeader> <pageFooter> <contents> <table> <tableRows> <tableRow> <tableCells> <tableCell> <contents> <date> <style> <dataFormat> <dateFormat/> </dataFormat> </style> </date> </contents> <style> <CSS value="vertical-align:top;text-align:left;width:25%"/> </style> </tableCell> <tableCell> <contents> <pageNumber/> </contents> <style> <CSS value="vertical-align:top;text-align:center;width:50%"/> </style> </tableCell> <tableCell> <contents> <time> <style> <dataFormat> <timeFormat/> </dataFormat> </style> </time> </contents> <style> <CSS value="vertical-align:top;text-align:right;width:25%"/> </style> </tableCell> </tableCells> 
</tableRow> </tableRows> <style> <defaultStyles> <defaultStyle refStyle="tb"/> </defaultStyles> <CSS value="border-collapse:collapse;width:100%"/> </style> </table> </contents> <style> <defaultStyles> <defaultStyle refStyle="pf"/> </defaultStyles> <CSS value="padding-top:10px"/> </style> </pageFooter> </page> </reportPages> </layout> </layouts> <XMLAttributes><XMLAttribute name="RS_CreateExtendedDataItems" value="true" output="no"/><XMLAttribute name="listSeparator" value="," output="no"/><XMLAttribute name="RS_modelModificationTime" value="2011-06-09T13:51:23.483Z" output="no"/></XMLAttributes></report>
Monday, September 20, 2010
Drill Through Pointers for Deployment
Recently, I updated and moved a drill-through target report. Soon after, the source reports started throwing drill-through link errors. I later figured out that the target report had been moved to another folder, for back-up reasons, prior to the import. This caused the drill-throughs in the source reports to break.
A drill-through source report and target report are linked to each other through their search path and ID. When a target report is moved, the source report specification gets updated with the new path without any manual modification.
This can prove to be a boon to some and a headache to others when doing deployments. An administrator should keep this in mind when deploying drill-through reports, or it will result in broken links.
Let us consider a scenario where we have a source report and a target report in Folder 1.
Now the target report is moved to My Folders, and a newer version of the target report is imported into Folder 1. The source report's link to the target report breaks.
The reason: when the target report was moved to My Folders, the source report re-pointed to the new path. Even when a newer version of the report with the same name is pasted into the original path, the source report's spec still points to the new path, and the source report's ID is linked to the older target report's ID. Hence the links from the source report break.
Until the ID association between the source and target report is lost, deployments need to be done carefully. The ID association is lost when the target report is deleted or when the target report is imported separately from the source report.
Friday, September 10, 2010
Multiple Queries for different data containers
Why does Cognos generate multiple queries even when you have different objects pointing to the same query?
Before I answer that, here's a tip for newbies on seeing the SQL generated by Cognos. To identify the SQL that Cognos fires against the DB, use the Generated SQL/MDX option from the Tools menu, not the one available for each individual query object. This displays the various SQL statements Cognos would fire against the DB, and here you would notice that even if you have multiple objects pointing to the same query object, Cognos can still fire multiple queries.
If you have, say, multiple lists all pointing to the same query object, then unless the columns pulled in, the sort order, and the group-by clauses match, Cognos will fire multiple queries. To make the columns match, you can use the Properties property of each list and select the columns that are missing from it relative to the other lists based on the same query subject.
If you have different data containers, say a list and a crosstab, or a bar chart and a crosstab, pointing to the same query subject, Cognos will fire multiple queries because that's how it's designed. So no luck there.
Friday, September 3, 2010
New Dynamic Sorting Controls from IBM Cognos
Cognos seems to have come up with new controls for dynamic column sorting that don't involve the drill-through method.
Here's the link to the article.
http://www.ibm.com/developerworks/data/library/cognos/reporting/scripting_techniques/page515.html?ca=drs-
Sounds cool!! Hope this ends the report re-run issues we have while using the drill-through method for sorting.
I will check this method out and post my comments later.
Let me know your thoughts on this method though.
Here's an update from PaulM about applying this method on Cognos 10 environment:
[User comments: PaulM]
There are some differences between the XML files from Cognos 8 to Cognos 10, but the general structure of the files is the same, so making the correct updates is not difficult.
First you need to use a decent text editor, one with comparison capabilities. Notepad++ is excellent for this.
Open both the C10 version and the version from the download. They should be in different tabs in Notepad++. Click on the plugins menu and select Compare --> Compare.
Find and copy the differences related to this technique only. Don't try to copy deprecated objects over. All of the related fragments have "orderCol" somewhere in the name.
Tuesday, August 31, 2010
Using CAMID for Bursting
To burst reports to Cognos directories so that they are visible to all users with access to the respective folder / report, you would need to burst to the Everyone group.
Retrieve the CAMID of the Everyone group and create a data item with the CAMID as its expression. Set this as the recipient data item.
'CAMID("::Everyone")'
Likewise you can burst to various groups of users by retrieving their CAMIDs and setting them up as recipients in your database recipient table.
Tuesday, August 24, 2010
Master Query Cannot Be a Nested Query - Error Message
When you try bursting a report that has multiple pages with page breaks set on different queries, the report could throw the error "Master Query Cannot Be a Nested Query" unless you have master-detail relationships set up between the burst query and the page set queries.
Master Detail relationships can be set up using the "Master Detail Relationships" property of the Page Set query.
Tuesday, August 17, 2010
Single Select Tree Prompt on a Relational Model
You can base tree prompts not just on a dimensional model but on a relational model as well. This involves using macros and decoding MUNs (member unique names). This has been covered in an article on IBM's support site, but I am covering it here for the benefit of those who use this blog as a one-stop site for all Cognos needs.
The solution below works for a single-select tree prompt. I am working on multi-select tree prompts and will update this article once I have a solution.
Requirement: Create a Date Dimension Tree Prompt based on a Relational Date Dimension Table and filter the report to retrieve data for the selected dates.
Solution:
Step 1: In the FM Model, create a stand alone Date Regular Dimension based on the existing Date Dimension query subject in the Presentation Layer View.
This Dimension would be used for creating the Tree Prompt.
Step 2: In the report create a tree prompt based on Date Dimension.
Step 3: Create a filter as below:
[Date] = #csv(substitute(']','',grep(']',split('].[', grep('all',split('->',prompt('dte','memberuniquename','[Presentation Layer].[Date Dimension].[YEAR].[DATE]->[all].[2008].[Q1 2008].[Jan 2008].[Wk 1].[2008-01-01]')))))))#
The above filter works as follows:
- First, the MUN for the selected tree prompt value is retrieved.
Example: [Presentation Layer].[Date Dimension].[YEAR].[DATE]->[all].[2008].[Q1 2008].[Jan 2008].[Wk 1].[2008-01-01]
- The retrieved MUN is split into 2 parts on the symbol "->".
Example: [Presentation Layer].[Date Dimension].[YEAR].[DATE] and [all].[2008].[Q1 2008].[Jan 2008].[Wk 1].[2008-01-01]
- The part containing "all" is selected.
Example: [all].[2008].[Q1 2008].[Jan 2008].[Wk 1].[2008-01-01]
- That part is split again into multiple parts on the symbol "].[".
Example: [all, 2008, Q1 2008, Jan 2008, Wk 1, 2008-01-01]
- The part containing "]" is retrieved.
Example: 2008-01-01]
- "]" is replaced with "".
Example: 2008-01-01] becomes 2008-01-01
Thus [Date] is compared with 2008-01-01 and the filter works as expected.
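For anyone who wants to trace the string handling outside of Cognos, here is a minimal Python sketch of the same steps. The function name date_from_mun is mine, and it assumes the member path contains "all" exactly as in the example above:

```python
def date_from_mun(mun):
    """Mimic the report macro: pull the leaf date value out of a member unique name."""
    # Split on '->' and keep the part containing 'all' (the member path)
    member_path = next(p for p in mun.split('->') if 'all' in p)
    # Split the member path on '].[' and keep the piece that still carries ']'
    leaf = next(p for p in member_path.split('].[') if ']' in p)
    # Strip the trailing ']' to leave the raw date value
    return leaf.replace(']', '')

mun = ('[Presentation Layer].[Date Dimension].[YEAR].[DATE]'
       '->[all].[2008].[Q1 2008].[Jan 2008].[Wk 1].[2008-01-01]')
print(date_from_mun(mun))  # 2008-01-01
```

The split / grep / substitute calls in the macro do the same work, just with macro functions in place of Python's string methods.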
Monday, August 16, 2010
User Query - Display Totals for Top 2 Ranks while displaying data for all other ranks in a crosstab?
This is in response to a user query.
Requirement: Display Ranks across Columns in a Crosstab and the Total column needs to include only the Top 2 ranks along with Overall Total column for all Ranks.
Solution:
Step 1: Create the base query Query 1 with the following data items: Product, Year, Sales, Rnk, Top 2 Sales Total.
Rnk - Rank([Sales] for [Product])
Top 2 Sales Total - case when [Rnk] < 3 then [Sales] end
Step 2: Create the outer query Query 2 by referencing the base query Query 1. Drag the data items - Product, Year, Sales, Rnk, Top 2 Sales Total from the referenced query.
Step 3: Set the aggregate function of Rnk to None.
Step 4: Create the crosstab by dragging in the data items Product, Sales, Top 2 Sales Total from Query 2.
Step 5: Create the Overall Total column by clicking on the Summary button for Rank column.
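As a sanity check on the Rnk and Top 2 Sales Total items, here is a small Python sketch with made-up figures for a single product; it mirrors Rank([Sales] for [Product]) and the case expression from Step 1:

```python
# Toy rows for one product: (year, sales) -- hypothetical data
rows = [(2008, 200), (2009, 100), (2010, 300)]

# Rnk - Rank([Sales] for [Product]): 1 = highest sales within the product
ranked = sorted(rows, key=lambda r: r[1], reverse=True)
rnk = {year: i + 1 for i, (year, _) in enumerate(ranked)}

# Top 2 Sales Total - case when [Rnk] < 3 then [Sales] end, then summed
top2_total = sum(sales for year, sales in rows if rnk[year] < 3)
print(top2_total)  # 500 (300 + 200)
```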
Wednesday, August 11, 2010
User Query - Creating Overall Percentage Summary Values in Reports
This requirement is in response to a user query:
Requirement: Display Dates, Sales, Percentage Sales along with summary total.
Solution:
Step 1: Create a crosstab report with Year, Month, Sales data items.
Step 2: Create a data item Total Sales - total(...[Sales] for report).
Step 3: Set the Aggregate and Rollup Aggregate functions to calculated.
Step 4: Create a data item Percentage - [Sales] / [Total Sales] and set aggregate and rollup to calculated. Drag this query item into the crosstab.
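The calculation in Steps 2 through 4 amounts to dividing each cell by a report-level total. A quick Python sketch with hypothetical monthly figures:

```python
# Hypothetical monthly sales figures
sales = {"Jan": 40.0, "Feb": 100.0, "Mar": 60.0}

# Total Sales - total([Sales] for report)
total_sales = sum(sales.values())

# Percentage - [Sales] / [Total Sales]
percentage = {month: s / total_sales for month, s in sales.items()}
print(percentage["Jan"])  # 0.2
```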
Here are a couple of other ways to format the above information:
Thursday, August 5, 2010
Cognos 8.4 Query Studio Bug - Including Model and Query Filter
Recently I ran into an issue with Cognos 8.4 Query Studio. When I try to include both a model filter and a Query Studio filter, the report hangs for a long time and then throws an "object not found" error on the model filter, even though the object exists in the model.
This has been identified as a bug in Cognos 8.4 and seems to have been corrected in version 8.4.1.
So here's one more reason to move to 8.4.1.
Labels:
Cognos,
Cognos 8,
Cognos 8.4,
Cognos 8.4.1,
Query Studio
Monday, August 2, 2010
Tree Prompts Explained
A tree prompt provides the user with the ability to navigate the levels of a hierarchy easily and select members across levels, or even select multiple levels within a hierarchy.
I have covered below a few points on Tree Prompts
- To set up a tree prompt filter, embed a prompt function inside a data item in your report.
Example: to display only the selected Product hierarchy members / levels and the associated revenue:
[Sales and Marketing].[Products].[Products]->?Products?
- To create a multi-select tree prompt, change the expression to:
set([Sales and Marketing].[Products].[Products]->?Products?)
- To display members starting from a particular level:
Example: display a tree prompt that allows users to select members or levels starting from Product Type.
In the query associated with your tree prompt, drag in the level from which you wish to display members and descendants, and set this as the use value for your tree prompt.
- Default values can only be hard-coded selections.
- To use a tree prompt in a macro:
set(#promptmany('Products','memberuniquename','[Sales and Marketing].[Products].[Products].[Product type]->:[PC].[@MEMBER].[951]')#)
- Tree prompts are based on a dimensional model, but you can still create tree prompts using a relational model, which I shall explain in a later article.
Monday, July 26, 2010
Identifying nth Weekday of a month in reports
Requirement: Identify the 2nd Thursday of the current month.
Solution:
Create data items ToDate, 1stDateofMonth, 1stDayofMonth, Nth, Day, AddDays, NthDay:
ToDate - current_date
1stDateofMonth - _first_of_month([ToDate])
1stDayofMonth - _day_of_week([1stDateofMonth], 1) // Assuming Monday is Day 1
Nth - 2 // The nth occurrence of the weekday required, in our case the 2nd Thursday
Day - 4 // With Monday as Day 1, Thursday is Day 4
AddDays - case when [Day] >= [1stDayofMonth]
then [Day] - [1stDayofMonth]
else
([Day] - [1stDayofMonth]) + 7
end
NthDay - _add_days([1stDateofMonth], (([Nth] - 1) * 7) + [AddDays])
NthDay returns the 2nd Thursday of the month.
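The same arithmetic can be checked with a few lines of Python. This is a sketch of my own (the function name is made up); the modulo expression computes the offset from the 1st of the month to the first occurrence of the target weekday, which is what the AddDays case expression is after:

```python
from datetime import date, timedelta

def nth_weekday(year, month, day, nth):
    """Date of the nth given weekday in a month (Monday = 1 ... Sunday = 7)."""
    first = date(year, month, 1)        # 1stDateofMonth
    first_dow = first.isoweekday()      # 1stDayofMonth, Monday = 1
    add_days = (day - first_dow) % 7    # offset to the first target weekday
    return first + timedelta(days=(nth - 1) * 7 + add_days)

print(nth_weekday(2010, 7, 4, 2))  # 2010-07-08, the 2nd Thursday of July 2010
```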
Monday, July 19, 2010
Required a Property for Relational querying !!!
I so wish there was a property or setting for relational querying vs dimensional querying. We are dealing with a model that is ER-modeled and based on an OLTP system, and we need to build an ad-hoc model on top of it. As you know, this kind of model is not supported by Cognos. Don't you think a setting in FM that instructed Cognos to generate ER-based queries, rather than identifying facts and dimensions based on cardinalities, would make life so much simpler?
I mean, just think: if there are 3 tables A, B and C joined to each other, you would expect a query that relates the 3 tables based on the joins. I do not want Cognos to decide whether A, B or C is a fact or a dimension and break up my query.
I know you would ask me to move to a dimensional model, but in our case it doesn't even make sense. The OLTP system doesn't deal with millions of rows, nor do we have an ETL system. So moving to a dimensional model would mean a lot of expenditure in terms of architecting, acquiring an ETL tool, designing and developing ETL workflows, time, resources and so on... I just wish there was a setting.
Tuesday, July 13, 2010
Practical Usage for Sectioning
Most often you would have used sectioning for formatting reasons. Here's how you can use sectioning and page-break concepts to make a single list report work like 2 lists.
Requirement: Display Data for Current Year and Previous Year as 2 separate lists on 2 separate pages.
Normally you would have used 2 separate queries for the 2 list reports. Using Sectioning and Page-Break you can use a single query to achieve this. You can use this method when you need to display multiple lists that differ in terms of filters alone.
Solution:
Step 1: Drag Product, Sales, Year in to the report
Step 2: Create a filter to filter the data for current and previous year.
Step 3: Create a data item called Section Title.
case when [YEAR] = extract(year from current_date)
then 'Current Year'
else 'Previous Year'
end
Step 4: Section on the Section Title data item
Step 5: Create a page break on Section Title data item.
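The Section Title item in Step 3 simply labels each row by comparing its year to the current year. A Python sketch of the same idea, with a fixed "today" for illustration:

```python
from datetime import date

def section_title(year, today=date(2010, 7, 13)):
    # case when [YEAR] = extract(year from current_date)
    #   then 'Current Year' else 'Previous Year' end
    return "Current Year" if year == today.year else "Previous Year"

print(section_title(2010))  # Current Year
print(section_title(2009))  # Previous Year
```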
Friday, July 9, 2010
Limited Cache, Query Re-Use Capabilities
I have always had a major complaint with Cognos, and that is with regard to caching and query re-use. I am not sure if this is the case with other BI tools as well, but it is really frustrating when you have to explain this to the users.
For instance, say you are asked to implement dynamic sorting on a report that includes a number of columns and retrieves a huge amount of data. With dynamic sorting, each sort fires a query against the DB, which hurts performance. And when you explain this to the users, their question is "Why doesn't Cognos just retrieve the data and push it to the client side, so that each sort would not hit the DB?" Good question, but that is not for me to answer.
Likewise imagine a scenario where you have to display 3 Lists in your report one each for YTD, QTD and WTD. Ideally what I would like to do is to get the YTD data and QTD, WTD would all be a subset of the YTD data. So if the YTD data can be retrieved and cached and this data re-used for the other queries that would be great. But unfortunately that is not the case with Cognos. When I create a query object and point other query objects to this the base query still gets executed multiple times.
These are all required features in Cognos.
The only cache and query re-use capability Cognos offers is when you have multiple lists all pointing to the same query with the same order by, group by and columns, which is rarely ever the case.
Wednesday, July 7, 2010
Bursting Explained - Part III
Links to Part I, Part II.
Scenario 3: Separate Burst Group, Burst Recipient and Report Queries.
In this case all 3 queries are separate.
Example: Burst a Sales report based on Sales Regions that has 2 lists in it, and make all the sales reports available to all users through Cognos Connection directories.
BurstGroupQuery (Master Burst Group Query) - Sales Region
ListQuery1 (Detail List Query 1) - Year, Sales Region, Revenue
ListQuery2 (Detail List Query 2) - Year, Sales Region, Sales Target
RecipientQuery (Detail Recipient Query) - Sales Region, Recipient
Create Master-Detail Relationships between Master Burst Group query and Detail List Queries. Create Master-Detail Relationship between Master Burst Group query and Detail Recipient Query.
In this example the master burst group query is run once and its data set is retrieved. For each record retrieved, the recipient query is run once. Likewise, each detail list query is run once per master record.
In the above case:
Master Query - Run Once, 100 records retrieved.
Recipient Query - Run 100 times and 1 record retrieved each time.
Report Queries - Run 100 times * 2 report queries
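The query counts above can be tallied directly. A tiny sketch using the post's assumption of 100 burst-group records and 2 detail list queries:

```python
# Scenario 3 query counts, assuming 100 burst-group records
groups = 100
master_runs = 1            # burst group query runs once
recipient_runs = groups    # recipient query: once per group record
report_runs = groups * 2   # 2 detail list queries per group record

total_queries = master_runs + recipient_runs + report_runs
print(total_queries)  # 301
```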
Tuesday, July 6, 2010
Bursting Explained - Part II
Links to Part I, Part III.
Scenario 2: A single Burst Group and Burst Recipient Query but multiple report queries.
In this case the report has multiple queries. The burst group and burst recipient information is fed by a single query.
Example: Burst a Sales report that has 2 lists based on Sales Regions and make the report available to users through Cognos Connection Directories.
RecipientQuery (Master Query) - Sales Region, Recipient
ListQuery1 (List Query 1) - Year, Sales Region, Revenue
ListQuery2 (List Query 2) - Year, Sales Region, Sales Target
You would need to create master-detail relationships between the Master Query and the 2 List queries. The Burst Group and Recipient query would be the master query.
In the above example, since the burst group and burst recipient are sourced from a single query, this query is run once before the start of the bursting process. Then for each record retrieved by the burst query the detail queries are fired once each.
Say we have 100 sales regions. The master query is fired once and 100 records are retrieved. Then for each sales region the 2 detail queries are fired, so you would see 2 * 100 detail queries (plus the one master query) fired against the DB in sequence.
It is in this scenario that you would see the entire burst process taking a long time to complete. Assume each report output is generated in 1 minute; the entire burst process in the above example then takes 100 minutes, well over an hour and a half.
Friday, July 2, 2010
Bursting Explained - Part I
I have been asked by many users what happens behind the scenes when Cognos bursts a report, and why burst reports take longer to complete. I have explained master-detail relationships and their impact on burst reports in an earlier post. In this post and the next couple of posts I am going to cover the various burst scenarios and how many SQL statements Cognos fires against the backend in each.
Bursting is made up of 3 components:
The Burst Query or Report Query: the report query that needs to be burst. The report could be a single-query or multi-query report. In the case of a multi-query report, each data container in the report needs to have a master-detail relationship set up with the burst group query.
The Burst Group Query: The query that decides the burst group. For each record in the burst group query the report is sliced.
Example: a report needs to be burst for each Sales Region. In this case the burst group would be Sales Region, and the report is sliced for each Sales Region record generated by the burst group query.
The Burst Recipient Query: The query that provides the recipient information.
Scenario 1: A single Burst Query, Burst Group Query, Burst Recipient Query.
In this case the list in the report, the burst group and the burst recipient information is fed by a single query.
Example: Burst a Sales report based on Sales Regions to be made available to users through Cognos Connection Directory.
Query 1 Data Items: Year, Sales Region, Sales, Recipient.
Recipient - CAMID('Everyone')
In the above Example, since burst group, burst recipient and list report are sourced from a single query, Cognos runs this query once, retrieves the entire data set and then slices the report output for the various sales regions. Hence only 1 query is fired by Cognos in this case.
Links to Part II, Part III
Monday, June 28, 2010
Event Studio 8.4 Issues
I am running into a weird issue with Event Studio. I have noticed a couple of issues while working with it before [Cannot Burst Report Views with Cognos 8.4], but the latest one is the strangest.
I have reports in a particular folder in prod. Since we are on 8.4 and not 8.4.1, we cannot burst report views. Hence we burst the reports but save the output in a report view by checking the "Save as Report View" option.
Now we need to have some reports emailed to the users. The "Attach link" option doesn't work in the email task, as we have 8 report tasks in our event. So we created 2 events and included the 1st event as a task in the second event, to get them all to run as a single event would.
This throws an error every time we run the event. The error message reads "..do not have permissions to update/delete the object...." and the path provided is that of the report. Obviously we do not have write permissions to the report folder.
But the question is: why is it trying to update / delete the existing report? Testing this out on dev, where we have write permissions on the report folder, works fine: the report view is created and report outputs are saved in it, but the Modified Date of the original report is changed. This is not expected and is unacceptable in our case.
This happens only in the master-child event scenario. If we run the two events separately, one after the other, there are no errors, but the original report's modified date is still changed. We are planning to move to 8.4.1 so that we can burst report views and leave the original report untouched.
Has anyone else faced this issue?
Thursday, June 24, 2010
Implementing Dynamic Sorting with Dimension Model??
I am trying to implement dynamic sorting using a dimensional model, without much luck. I have written to various Cognos forums as well, but no help there.
Any help would be greatly appreciated.
Including Comments in Data Items
Here's a tip for those who would like to include comments in data items that include complex calculations or nested case / if-then-else statements:
The syntax for including comments - #/*Include comments here*/#
Wednesday, June 23, 2010
Disabling Drill Through on Totals in Crosstab
If you have a crosstab report with a drill-through on the measure and you also have totals, you would notice that the drill-through is applied on the totals column as well. If you try removing the drill-through on the totals, the drill-through on the measures also gets removed.
Solution:
Step 1: In the Total Column, set the Define Contents property to Yes. The measure value disappears in this column.
Step 2: Now re-drag the total measure into the column from Insertable Objects after unlocking the column.
Step 3: Navigate to Crosstab intersection for the column and set the Source Type to Cell Value.
The Drill Through Link is no longer applied on the Total Column.
Monday, June 21, 2010
Celebrating 50 Posts!!!
Hurrah!!! This is my 50th post. When I started this blog, I didn't think I would get this far in such a short amount of time. I thought I would run out of articles to blog about after a couple of months, and now I have at least 10 more articles planned that I need to write.
I would like to thank all of you for your support and hope you have found this blog useful in some way. In case there are any articles in particular that you would like me to cover please feel free to let me know and I will do my best addressing them.
Drill through to different reports on clicking of different series in a chart
Requirement: A chart report displays Sales and Revenue. On clicking a Sales value, users should be able to drill through to a detailed Sales report. Likewise, on clicking the Revenue series, users should be able to view a detailed Revenue report.
Solution:
Step 1: Create Source report with Data Items - Year, Sales, Revenue.
Step 2: Create 2 data Items - FlagA, FlagB for Sales series and Revenue Series.
FlagA - '1'
FlagB - '2'
Step 3: Nest FlagA data item under Sales Series and FlagB data item under Revenue series.
Step 4: Create 1 Target Report, with 2 lists - one for Sales and one for Revenue and include filter on Year to show up the report for the corresponding year passed from the source report.
Step 5: In Query 1 associated with Sales, include the filter ?FlagA? is not null to force the parameter to show up in the drill-through definition. Likewise, in Query 2 include the filter ?FlagB? is not null.
Step 6: Create a render variable to conditionally display the required list.
case when ?FlagA? is not null then 'A' else 'B' end
Step 7: Apply render variable on the lists to display accordingly.
Step 8: In the source report, set up the drill through link to pass the Year, FlagA and FlagB data item.
Note: The legend displays the values of FlagA and FlagB concatenated with the Sales and Revenue keywords. If you wish to avoid this, you will need to create a manual legend (designed outside of the chart).
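The render-variable routing in Steps 5-7 can be sketched in Python (a rough illustration only; `flag_a` stands in for the ?FlagA? parameter, which is present only when the drill-through came from the Sales series):

```python
def list_to_render(flag_a):
    """Mirrors the render variable:
    case when ?FlagA? is not null then 'A' else 'B' end"""
    # 'A' -> render the Sales list, 'B' -> render the Revenue list.
    return "A" if flag_a is not None else "B"
```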
Friday, June 18, 2010
Cannot Burst Report Views with Cognos 8.4
We are on Cognos 8.4, and I hit a bad setback today when I was trying to burst report views: the option to burst report views is no longer available in 8.4. When I looked it up on the support site, there is a PMR on this issue, and the suggestion is to move to 8.4.1.
For all those on 8.4, a move to 8.4.1 is suggested.
Tuesday, June 15, 2010
Tree Prompt Dynamic Default Selection
I have a requirement to default my tree prompt to a dynamic value, say the current year. This is not achievable directly, so I tried looking for existing JavaScript code, but with no luck.
Hence I was forced to dig through the Cognos source code to see if any of the internal functions could be used, and I did come across a function that can be used to set the default values. There are still some issues with this code; for example, deselecting a child node doesn't deselect the parent node if no other child nodes are selected.
But the users are willing to live with this. I am not going to post this code unless someone desperately requires it, as I would not like anyone to face issues upgrading their reports or have this code interfere with any other code in the report. It is not a tried and tested method either. If I don't run into any issues with this code and I am able to iron out the other minor issues, I will post it.
Monday, June 14, 2010
Metadata Callbacks due to Table Type not being Set
Recently, while testing my FM model, I noticed that one of the query subjects was causing metadata callbacks, with the message "The metadata for...will be retrieved from the database due to the table type not set for query subject...". I verified that the model query subject did not have any relationships to other query subjects, and the data source query subjects did not have any filters, determinants or calculations.
It was only when I tried re-creating the query subject that I figured out the cause: a difference in case between the table name in the query subject and in the schema. The query subject had the table name as "TABLE" while in the schema it was "table".
Once I modified the data source query subject to point to "table" as in the schema, there were no metadata callbacks.
Using Parameter Maps to implement Data Masking
Here's a real-world example of using parameter maps to implement data masking, along with the disadvantages of the solution below.
Requirement: Data is stored in 2 different schemas with the same table names. One schema contains Secured Data and the other Unsecured or masked data.
Users have access to either Secured Data or Unsecured Data. This information is stored in a security table. Users need to be dynamically pointed to either of the schemas depending on their security level clearance.
Solution:
Step 1:The security table contains the following columns - UsrID, HasSecurityClearanceFlag.
Step 2:In FM create a data source query Subject called SecurityLevel based on the above table.
Step 3:Create a Parameter map called Usr_Access based on the above Query Subject with key set to UsrID and value set to HasSecurityClearanceFlag.
Step 4:Create a model Query Subject called SchemaName with columns - UsrID, HasSecurityClearanceFlag, Schema.
Schema -
case #$Usr_Access{$account.defaultName}# when 1 then 'Schema1'
when 2 then 'Schema2'
else 'DefaultSchema' end
Step 5:Create another Parameter Map called Schema based on the above Model Query Subject with key set to UsrID and value set to Schema.
Step 6:In Query Subjects that need to be dynamically re-pointed to the schemas modify the SQL as below:
Select * from #$Schema{$account.defaultName}#.TableName
Disadvantages:
Modifying the Data Source Query Subjects results in Metadata Callbacks and this is a cause of concern.
I am still trying to find a viable solution wherein there are no callbacks and yet we can achieve the above security requirement.
Any suggestions?
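The two parameter-map lookups can be sketched in Python as below (the user IDs, flag values and schema names are illustrative only, standing in for the Usr_Access and Schema maps):

```python
# Usr_Access parameter map: key = UsrID, value = HasSecurityClearanceFlag.
# Illustrative entries only.
USR_ACCESS = {"alice": 1, "bob": 2}

def schema_for(user_id):
    # Mirrors the case expression on the Schema column of the
    # SchemaName model query subject.
    flag = USR_ACCESS.get(user_id)
    if flag == 1:
        return "Schema1"
    if flag == 2:
        return "Schema2"
    return "DefaultSchema"

def masked_sql(user_id, table):
    # Mirrors: Select * from #$Schema{$account.defaultName}#.TableName
    return f"Select * from {schema_for(user_id)}.{table}"
```

At run time Cognos substitutes the macro with the value keyed by the signed-on user, which is what the dictionary lookup emulates here.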
Wednesday, June 9, 2010
Generated Native SQL vs Tools Native SQL
Cognos doesn't always run the Native SQL that you see from the Tools / Generated SQL/MDX option. There are scenarios wherein it runs the Native SQL that you see from the query-level Generated SQL property.
I have a report with some columns and count(distincts). The report also includes a footer that should display count(distincts ) at the footer level rather than the sum.
The Native SQL from Tools/Generated SQL/MDX generates a SQL:
Sel T1.C1 attribute_1, T1.C2 attribute_2,
T1.C3 Qty_A, T0.C3 Qty_A_Footer
from ()T0, ()T1
where....
and the Native SQL from the Query Level Generated SQL options does not include the footer level calculation. Hence the SQL would look like:
Sel attrib_1, attrib_2, count(distinct fact_1) from....
Now I would expect Cognos to fire the Native SQL from the Tools/Generated SQL/MDX options but that is not the case.
Cognos fires the Native SQL from the query-level Generated SQL. The reason in this case was some fields that required Cognos functions. Once I changed those fields to database-only processing and included database-only functions, Cognos started firing the Native SQL from the Tools option. I am not sure if this is specific to Teradata or if this is how Cognos is intended to behave.
I plan to do some more research on this and will update when I do.
Tip: To be on the safe side, always verify both Native SQLs to ensure that they are what you expect to be fired against the DB.
Friday, June 4, 2010
Bursting Charts / Crosstabs
Note: Charts / crosstabs cannot be burst using a single query as the burst query, the burst recipient query and the crosstab query.
You need to create a master query and use that as both the burst query and the burst recipient query, then create a master-detail relationship with the chart / crosstab query.
Thursday, June 3, 2010
Teradata - Pointers
When using Teradata, ensure that you thoroughly check the native SQL as well as have a look at the query that Cognos fires against the back-end.
I have come across situations wherein the Native SQL looks odd because there are some functions that Cognos doesn't pass down to Teradata even though they are valid in Teradata.
For example: sel current_time works in Teradata, but when used in Cognos it doesn't get passed to the DB. This results in Native SQL that may not be what is expected.
In one other case I had a master-detail relationship for burst reasons and the master and detail queries were joined on ID columns and a date column. In the query that Cognos fired against the DB I expected to see the detail queries having a filter on the ID columns and the date column. But that was not the case. The join on date column went completely missing from the detail query. This was being handled by the Cognos server rather than the DB.
So to figure out why this was happening, I set the processing to database only on all queries and validated the report.
This threw a "requires local processing.." error because of some cast, even though the report didn't have any cast fields. I then removed the join on the date field in the master-detail relationship, and the report validated fine. Next, I included a cast on the date on both sides of the relationship, re-included the relationship and validated again; the report validated perfectly. I burst the report and noticed that the detail query now included the required date filter.
Here's some advice in case you are using Teradata as your back-end, or even otherwise: build your report queries without using any Cognos functions to begin with, set all queries to database-only processing and then validate the report. After this, use Cognos functions only if required. This way you are assured that whatever you expect to be passed to the DB actually gets passed, so you don't have any surprises like I did.
Friday, May 28, 2010
Highlight Current, Past and Future Data in Crosstab
Those who have tried conditional formatting in a crosstab know that it's a pain, especially if your condition is based on data items that are not part of the crosstab.
Requirement: For each month and each week-ending date, display the Revenue spread across the days of the week. Highlight past weeks, and past dates in the current week, in orange; the current date and future dates in the current week in yellow; and all other future dates in white.
The challenge is that the dates themselves are not displayed, only the day of week. So the only items we have to work with are the week-ending date and the day of week.
Solution:
Step 1: Create a list report to display the Months.
Step 2: Create the crosstab inside the list with master - detail set on Month and section on Month.
Step 3: Drag the Weekending Date data item and create the Weekday data item: _day_of_week([DTE],1) and use case statement to display Monday - Friday.
Step 4: Create the conditional variable to highlight the required data as below:
case when
(
string2date([SummaryQ2].[Week Ending Display]) < _add_days(date2timestamp(today()),7 -_day_of_week(date2timestamp(today()),1))
)
or
(
string2date([SummaryQ2].[Week Ending Display]) = _add_days(date2timestamp(today()),7 -_day_of_week(date2timestamp(today()),1))
and _day_of_week(date2timestamp(today()),1) > ([SummaryQ2].[Days])
)
then 'Orange'
when string2date([SummaryQ2].[Week Ending Display]) = _add_days(date2timestamp(today()),7 -_day_of_week(date2timestamp(today()),1))
and _day_of_week(date2timestamp(today()),1) <= ([SummaryQ2].[Days])
then 'Yellow'
else 'White' end
Step 5: Associate the crosstab intersection with the conditional variable and set the corresponding colors.
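The intent of the conditional variable can be sketched in Python (a rough emulation under the assumption that `date.isoweekday()` plays the role of `_day_of_week(..., 1)`, with Monday = 1):

```python
from datetime import date, timedelta

def current_week_ending(today):
    # Mirrors _add_days(today(), 7 - _day_of_week(today(), 1)):
    # the day-7 (Sunday) date of the current week.
    return today + timedelta(days=7 - today.isoweekday())

def cell_color(week_ending, day_of_week, today):
    cwe = current_week_ending(today)
    if week_ending < cwe or (week_ending == cwe
                             and today.isoweekday() > day_of_week):
        return "Orange"   # past weeks, or past days in the current week
    if week_ending == cwe and today.isoweekday() <= day_of_week:
        return "Yellow"   # today and future days in the current week
    return "White"        # future weeks
```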
Monday, May 24, 2010
Setting Date Format on Chart X Axis
For those who didn't know, to set formats for Dates on a chart's X axis, change the Chart Node Member's source type from Member Caption to Data Item Value.
Friday, May 21, 2010
Count Distinct and Summaries
When you have data items with aggregate type set to Sum and rollup aggregate set to Sum, and you include a summary footer in your report, the SQL Cognos generates would be something like:
select sum(column 3) over (partition by column 1, column 2), sum(column 3) over (partition by column 1)
But count distinct over a partition is not supported by the databases, so if your requirement is to display count distincts for the detail records as well as the summary footer, Cognos generates 2 SQLs like:
select T0.c1,
T0.c2,
T0.c3,
T1.c2
from
(
select
column 1 c1, column 2 c2,
count(distinct column 3) c3
group by 1,2
) T0,
(
select
column 1 c1,
count(distinct column 3) c2
group by 1
) T1......
Now the disadvantage is that if your T0 hits huge fact tables, you end up hitting those fact tables twice. The above scenario is at least somewhat acceptable, but imagine a scenario wherein you have union clauses. I have a report with 5 union clauses, so the count distinct causes the 5 union queries to be executed twice, which impacts performance.
The solution was to include one more union to fetch the summarized data, with dummy data items thrown in to sort it after the detail records, and then conditionally format the report so the summarized row appears as a footer.
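To see why the footer cannot simply be rolled up from the detail counts (which is what forces the second scan, and what the extra union works around), here is a small Python sketch with made-up data:

```python
from collections import defaultdict

# Illustrative rows: (column 1, column 2, column 3), where column 3 is
# the item counted distinctly (say, a customer ID).
rows = [
    ("East", "A", "cust1"), ("East", "A", "cust2"),
    ("East", "B", "cust2"), ("West", "A", "cust1"),
]

def distinct_counts(rows):
    detail = defaultdict(set)   # distinct column 3 per (column 1, column 2)
    footer = defaultdict(set)   # distinct column 3 per column 1 only
    for c1, c2, c3 in rows:
        detail[(c1, c2)].add(c3)
        footer[c1].add(c3)
    return ({k: len(v) for k, v in detail.items()},
            {k: len(v) for k, v in footer.items()})

detail, footer = distinct_counts(rows)
# Summing East's detail counts gives 2 + 1 = 3, but the true footer
# count is 2, because cust2 appears under both A and B.
```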
Saturday, May 15, 2010
Cognos SDK Training
I've been in Cognos SDK training for the past 3 days... It was great... Now I know what is and what isn't possible with the SDK. A whole lot of times when you say something isn't possible, you get asked whether it is possible with the SDK, and I could never respond because I didn't know. This training has now helped me answer that.
The thing about the SDK is that whatever is possible with the Cognos tools is what is possible with the SDK. Basically you can use it a lot for automation, such as creating a screen for managing users, getting a list of Cognos objects / reports and so forth. I am still looking for real-world examples wherein this can be used; if you have been using the SDK, please share your experiences with us.
One great thing I got to know about is Cognos Mashup Services, or CMS for short, which in my opinion is a hot thing. I will post more about this in another article.
And here's another tidbit I picked up: the next version of Cognos will in all probability be called Cognos 10 and is expected to be released in the 4th quarter of 2010.
Monday, May 10, 2010
Burst Reports and Master Detail Relationships
Burst reports with master-detail relationships are not a good combination in Cognos. This seems to be a major issue with Cognos, and I hope they rectify it soon. When you have reports with master-detail relationships and you need to burst them, be aware that the report queries will be executed for each burst query group item.
The drawback is that you almost always end up with a report that requires a master-detail relationship, either because it has multiple pages with each page associated with a different query, or some such reason; it's almost impossible to eliminate master-detail relationships. What you end up with because of this is a headache: either a huge number of queries fired against the DB one after the other, or the scheduled process taking a long time as the burst reports execute one after the other.
What IBM needs to do is provide a property that report authors can set to control whether the detail query runs for each master item, or runs once and is then sliced based on the master query.
Wednesday, May 5, 2010
Saving Reports to a Local Directory
This topic is covered in the Cognos docs, but I thought I would explain it in simple terms and include images for the benefit of others.
Step 1: In Cognos Configuration, click the Actions menu, select "Edit Global Configuration" and choose the General tab.
Step 2: Enter a file system path in the "Archive Location File System Root" property. Example: file://\\c:\ReportOutput. This will serve as the root file system. You can create sub-folders inside this root folder to hold the report outputs for each individual project. Save the changes and restart the Cognos server.
Step 3: Launch IBM Cognos Connection, select the Configuration tab and choose the Dispatchers and Services option.
Step 4: Click on Define File System Locations icon. Click New. Provide a Name. In the file system location text box enter the sub-folder name. Example: Project 1 and click Finish.
And here is a contribution from RoadwarriorPB. A big thanks to RoadWarriorPB for sharing the below information -
RoadwarriorPB -
In Step 5, you can define any number of file system locations. These definitions are nothing more than 'objects' in the Cognos Content Store, and can be secured like any other object. This can be very beneficial in a hosted, SaaS environment: you can define one 'location' per tenant, and secure the location for the group associated with the tenant. Users will only 'see' the locations they have rights to use.
The file system location is 'local', or relative, to the Cognos Content Manager service, not the report dispatchers. If you have one or more standby Content Managers, you will want to define network shares mapped to the same physical location for every Content Manager to ensure that all Content Managers utilize the same physical file system location for output.
And here is a contribution from Nicolas. A big thanks to Nicolas for sharing the below information -
Nicolas -
This setting ("Save report outputs to a file system?") is used with the Content Manager advanced option "CM.OUTPUTFOLDER" which exports ALL executed reports to a designated folder (check the description provided by Cognos Configuration for that particular option). This was an old and rudimentary way for Cognos to handle output to file system, and later Cognos provided an Output Processor which handled this a little bit better.