tag:blogger.com,1999:blog-84136382230260499902024-03-04T23:55:55.552-05:00Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.comBlogger73125tag:blogger.com,1999:blog-8413638223026049990.post-35447519968760123952023-09-26T11:43:00.001-04:002023-09-26T16:01:11.088-04:00Not BI, but...<p>Hello all! Yes, again it has been a while! </p><p>Life is sometimes a wondrous journey.</p><p>My latest obsession has been 8-bit computers! Building them from scratch.</p><p>Take a look at my <span style="color: red;"><a href="https://www.youtube.com/c/pony80" target="_blank">Z80 based computer</a></span>. This has been a lot of fun!</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_wuqGFsdcYho4v-Ny1VObwKewtKtGOznbwmtje4Kw3OU6TMndGMuyK1L4VhcqT2vn2Jq7uO44F-t006OB1caLNT57fICDkraNZP-z8v3rOO8pfVsNy3GEWiqe73QwMV6uqe45rDxrR3Rpd3YebYTvYhxMhcZqCa-PiHtMT4P6wqlkj2ocKf-2a65SNuA/s3185/Pony80a.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="3185" data-original-width="3024" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_wuqGFsdcYho4v-Ny1VObwKewtKtGOznbwmtje4Kw3OU6TMndGMuyK1L4VhcqT2vn2Jq7uO44F-t006OB1caLNT57fICDkraNZP-z8v3rOO8pfVsNy3GEWiqe73QwMV6uqe45rDxrR3Rpd3YebYTvYhxMhcZqCa-PiHtMT4P6wqlkj2ocKf-2a65SNuA/s320/Pony80a.jpg" width="304" /></a></div><br /><p><br /></p>Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-82974355365292371302019-02-08T16:35:00.002-05:002019-02-08T16:35:59.890-05:00PowerBI Report Server Change - January 2019Yes, it has been a while! Been busy with life.<br />
<br />
However, just fixed an issue on the new release of PowerBI Report Server - January 2019.<br />
<br />
We use training videos on the report server to show the users how to use the dashboards; however, after the January release, I was restricted from uploading the file due to the error:<br />
<br />
"Uploading or saving files with .mp4 extension is not allowed."<br />
<br />
After a bit of digging around, I found this entry in the ConfigurationInfo table in the ReportServer database: AllowedResourceExtensionsForUpload<br />
I simply added *.mp4 and now I am allowed to upload the videos again!<br />
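For reference, the change was a simple update to that row. This is a hedged sketch only: it assumes the Value column holds a comma-separated list of extensions (e.g. *.pdf,*.xml,...); back up the ReportServer database first, and the service may need a restart to pick up the change.

```sql
-- Sketch: append *.mp4 to the allowed upload extensions (assumes comma-separated list)
UPDATE dbo.ConfigurationInfo
SET    Value = Value + ',*.mp4'
WHERE  Name = 'AllowedResourceExtensionsForUpload'
  AND  Value NOT LIKE '%*.mp4%';
```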
<br />
Hope that helps<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-78555512605300774562017-05-12T14:21:00.003-04:002017-05-12T14:21:54.239-04:00SSIS Error: DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005 <br />
<br />
I was hit with this VERY generic error writing data to DB2/i in SSIS using MS OLEDB for DB2.<br />
After much searching and troubleshooting, I discovered that this error is thrown whenever you try to insert a NULL into a non-nullable column.<br />
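One way to guard against this is to substitute a default in the source query before the data ever reaches the destination. A minimal sketch (the column and default are hypothetical):

```sql
-- Replace NULLs with a sentinel value so the DB2 insert never sees one
SELECT ISNULL(SomeColumn, 0) AS SomeColumn
FROM   dbo.SourceTable;
```

Alternatively, a Derived Column transformation with REPLACENULL() (available in SSIS 2012 and later) accomplishes the same thing inside the data flow.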
<br />
Hope this helps someone else save a few hours of work.<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-70641262523429278032015-01-08T13:32:00.002-05:002015-05-01T16:59:03.662-04:00SSRS Issue with Microsoft OLE DB for DB2 v3.0I ran into a perplexing problem on one of my SSRS servers. This server handled the reporting for internal users, and had a lot of reports that read from our source system, on iSeries DB2 (V6R1), and some reporting from our SQL Server data mart.<br />
<br />
The issue was that at random times, SSRS would fully utilize one processor, and a lot of memory.<br />
<br />
When looking at the active jobs, it was never the same report, so we tried to find the common thread in the reports that were killing the server. Nothing we found stopped it from happening. I adjusted the maximum memory setting in SSRS, trying to keep the server from using all of its memory and causing a restart of the SSRS service. This had no effect either.<br />
<br />
Finally, one of my team members stumbled across the commonality in all of the offending reports: They all used DB2 as the data source.<br />
<br />
Well, the first thing I thought of was the Microsoft OLE DB for DB2. We had v3.0 installed on this server. It had been working fine for over two years, but both the iSeries and the SSRS server receive regular OS updates, one of which could have started this issue.<br />
<br />
So I upgraded this server to v4.0, tested the reports to make sure we didn't break anything, and turned the users loose on it.<br />
<br />
After two days I can call this a success! The server rarely gets above 10% processor, and very little memory is being used. Reports are faster, and the users are happier!<br />
<br />
Update: The server is running MUCH better, but we have run into an issue with some of the queries causing DB2 errors because the parameters are defined as Text while the database field is numeric. I am still looking for a global way of fixing this, hopefully via the connection string to the DB2.<br />
<br />
Update #2: After running v4.0 for 4 months, the problems have definitely gone away. The only two side effects of this upgrade are:<br />
<ul>
<li>A Text type parameter in an SSRS report will throw an error if used on a numeric data type on the DB2. To rectify this, I just did a cast on the numeric field when comparing to the text parameter.</li>
<li>In v3.0, field names were truncated to a length limit shorter than what DB2 allows, so reports built against v3.0 used the truncated names. v4.0 did away with this limitation, so field names now come through at their full length, causing field name mismatches. Example:</li>
<ul>
<li>AReallyLongFieldNameForTesting on DB2 might show up in v3.0 as AReallyLongFieldNameFo. Once you upgrade to v4.0, the full field name comes in, causing a name mismatch in your reports.</li>
</ul>
</ul>
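For the first point, the workaround in the report query looks roughly like this (hedged sketch; the library, field, and column names are made up, and the CHAR length should match your column):

```sql
-- DB2 query: cast the numeric column so it compares cleanly to the Text parameter
SELECT ACCTNO, ACCTNAME
FROM   MYLIB.ACCOUNTS
WHERE  CAST(ACCTNO AS CHAR(10)) = ?
```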
<br />peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-47582368426165770452013-09-05T09:12:00.001-04:002013-09-05T09:12:25.182-04:00Limitation with YEAR() function in SSISI learned something new today while researching an ETL failure:<br />
The YEAR() function in SSIS has a lower limit of 1-1-1753.<br />
<br />
Even though the DT_DATE and DT_DBDATE support dates earlier than 1-1-1753, the YEAR() function does not and will fail.<br />
<br />
For validation in a derived column transformation, I was using this logic:<br />
YEAR(data_field) < 1900? (DT_DBDATE)"1-1-1900":data_field<br />
<br />
And when the date 8-23-0203 came thru, the package failed. The error was nondescript:<br />
<blockquote class="tr_bq">
<span style="color: red;">An error occurred while evaluating the function.</span></blockquote>
<br />
So, be aware...<br />
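The fix that avoids the YEAR() limit is to compare the dates directly instead of extracting the year. A sketch of the revised derived column expression (assuming data_field is already a DT_DBDATE, which supports these early dates):

```
data_field < (DT_DBDATE)"1900-01-01" ? (DT_DBDATE)"1900-01-01" : data_field
```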
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-67029496175904417992013-03-20T11:47:00.001-04:002013-03-20T11:47:36.286-04:00SSIS and Max Packet SizeWorking to improve my ETL performance, I changed my Max Packet Size on my database server and my ETL server to 32767. However, when I tried to run an SSIS package, I received the following error:<br />
<blockquote class="tr_bq">
Communication Link Error.<br />Shared Memory Provider: No process is on the other end of the pipe.</blockquote>
I also received this error in SQL Server Management Studio when connected to Integration Services on that server and browsing through the package store to a package. <br />
After a little research, I determined that SSIS cannot read packages from the MSDB database if the Max Packet Size is set larger than 16384.<br />
<br />
So at this point, my options are to lower the Max Packet Size on the ETL server, save my SSIS packages to the file system, or move the packages to a package store on a different server with a Max Packet Size of 16384 or lower.<br />
<br />
I am going to test this out and see if it warrants moving the packages to either the file system or another server, or if the 16384 packet size will be sufficient for our ETL processing.<br />
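For reference, the server-side option can be checked and lowered with sp_configure (a sketch; run on the affected instance, and note that RECONFIGURE is needed for it to take effect):

```sql
-- Show the current setting
EXEC sp_configure 'network packet size (B)';

-- Lower it back to the SSIS-safe maximum
EXEC sp_configure 'network packet size (B)', 16384;
RECONFIGURE;
```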
<br />
<br />
peace <br />
Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-90015062886311633722013-03-13T11:28:00.000-04:002013-03-20T11:48:46.226-04:00Rebuilding All Indexes on a TableSometimes you have a need to rebuild all indexes on a table. This could be because of an ETL process, or other data loading process that renders the indexes out of date.<br />
I recently experienced this issue during ETL processing.<br />
<br />
I could have scripted out all of the index rebuilds for the table and added them to an Execute SQL task within SSIS. However, if any index were deleted or renamed, the Execute SQL task would fail and the ETL would stop. Also, if we added new indexes, they would not get rebuilt until we added them to the Execute SQL task.<br />
<br />
So I wrote a stored procedure to rebuild all of the indexes on a table, or just one index on a table.<br />
<br />
I wanted to share that procedure with you, since I think this may be a common scenario.<br />
<blockquote class="tr_bq">
<br />
-- =============================================<br />
-- spwRebuildTableIndexes<br />
--<br />
-- Author: Bob Pearson<br />
-- Create date: 2013-03-14<br />
-- Description: Rebuilds specified index or all indexes on a table<br />
--<br />
-- Updates<br />
-- Date Who What<br />
-- -------- ---- ----------------------------<br />
--<br />
-- =============================================<br />
CREATE PROCEDURE [dbo].[spwRebuildTableIndexes]<br />
@TableName varchar(100),<br />
@IndexName varchar(100) = null<br />
<br />
AS<br />
BEGIN<br />
<br />
Set NOCOUNT ON;<br />
<br />
Declare @SQL varchar(max) = '';<br />
<br />
if @IndexName is not null<br />
BEGIN<br />
Select @SQL = 'Alter Index ' + QUOTENAME(@IndexName) + ' on ' + QUOTENAME(@TableName) + ' REBUILD;';<br />
Execute (@SQL);<br />
print @TableName + '.' + @IndexName + ' Rebuilt...'<br />
Return;<br />
END<br />
<br />
Declare cur cursor for<br />
<br />
Select i.name <br />
from sys.indexes i <br />
inner join <br />
sys.objects o <br />
on i.object_id = o.object_id<br />
where o.name = @TableName and o.type = 'U' and i.name is not null<br />
<br />
for read only;<br />
<br />
Open Cur;<br />
<br />
Fetch Next from Cur into @IndexName;<br />
<br />
While @@FETCH_STATUS = 0<br />
BEGIN <br />
Select @SQL = 'Alter Index ' + QUOTENAME(@IndexName) + ' on ' + QUOTENAME(@TableName) + ' REBUILD;';<br />
Execute (@SQL);<br />
<br />
print @TableName + '.' + @IndexName + ' Rebuilt...'<br />
<br />
Fetch Next from Cur into @IndexName;<br />
END<br />
<br />
Close Cur;<br />
Deallocate Cur;<br />
END</blockquote>
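Calling it looks like this (the table and index names are placeholders):

```sql
-- Rebuild every index on a table
EXEC dbo.spwRebuildTableIndexes @TableName = 'FactSales';

-- Rebuild a single index
EXEC dbo.spwRebuildTableIndexes @TableName = 'FactSales', @IndexName = 'IX_FactSales_Date';
```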
peace<br />
<br />Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-13843814620385190212013-02-27T13:26:00.000-05:002013-02-28T09:51:35.590-05:00Quick T-SQL Tip #2: Identity Columns Seed ValueThis post really deals with high volume tables with an identity column, but the idea is something that everyone should at least be familiar with.<br />
<br />
Quite often I see table creation scripts that start like this:<br />
<br />
<blockquote class="tr_bq">
Create Table ATable<br />
(<br />
ID int not null identity(1,1)<br />
...</blockquote>
<br />
And while that is perfectly legal and will work with no issues, if this table is going to be a high volume table, you should consider using the entire Int, not just half of one.<br />
<br />
<i>What??</i><br />
<br />
The Int data type spans -2,147,483,648 thru 2,147,483,647. If your seed is 1, you lose 2,147,483,649 values that could be used.<br />
<br />
I have seen production situations where they have run out of identity values because the seed was 1.<br />
So, next time you create an Identity, ask yourself: "Two billion, or four billion?"<br />
<br />
and then <br />
<br />
<blockquote class="tr_bq">
Create Table ATable<br />
(<br />
ID int not null identity(-2147483648,1)<br />
...</blockquote>
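For a table that already exists but is empty (or about to be reloaded), the seed can also be moved with DBCC CHECKIDENT. A sketch; on an empty, freshly truncated table the next identity value generated will be the reseed value itself:

```sql
TRUNCATE TABLE ATable;
DBCC CHECKIDENT ('ATable', RESEED, -2147483648);
```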
<br />
peace <br />
<br />Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-13720555346576011472013-02-04T13:30:00.000-05:002013-02-04T13:37:17.462-05:00Quick T-SQL Tip #1 - Formatting SQLI am going to start a series of posts for quick T-SQL tips. In my current position, I support a group of business report developers that are not too familiar with T-SQL. This will help them learn some of the tips and tricks that I use frequently.<br />
<br />
So, tip #1, Formatting SQL.<br />
<br />
This tip does not just apply to T-SQL, but any kind of code in general.<br />
I am sure most other developers have been here: You inherit some code and the first hour is spent trying to figure out what it's doing!<br />
<br />
Well, here is my basic formatting. My goal is to let anyone look at it and understand it quickly!<br />
<br />
<br />
Select <i>tab</i> Field1 As Field1Name,<br />
<i>tab tab</i> Field2 as Field2Name<br />
From <i>tab</i> Table1 T1<br />
Inner join<br />
<i>tab tab</i> Table2 T2<br />
on <i>tab tab</i> T1.Field = T2.Field<br />
Where <i>tab</i> T1.Field = 'AA'<br />
Order by<br />
<i>tab tab</i> T2.Field1 ASC<br />
<br />
Doesn't look that pretty there, but in SQL Management Studio, it works like a charm! :)<br />
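Rendered with actual tabs, the layout above comes out like this:

```sql
Select    Field1 As Field1Name,
          Field2 as Field2Name
From      Table1 T1
Inner join
          Table2 T2
on        T1.Field = T2.Field
Where     T1.Field = 'AA'
Order by
          T2.Field1 ASC
```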
<br />
My goal is to have enough whitespace that the SQL does not look cluttered.<br />
No matter how you format, always keep the next developer in mind!!<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-65328518642820106662012-10-08T17:39:00.000-04:002013-02-04T13:35:12.212-05:00Error Calling Stored Procedure with DATE and TIME datatypes from SSISWhen calling a stored procedure using an OLE DB Command from SSIS to a stored procedure that has a DATE or TIME parameter, you will get the following error:<br />
<br />
<blockquote class="tr_bq">
An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Operand type clash: int is incompatible with date"</blockquote>
The quick fix to this is to change the parameter type in the stored procedure to a varchar(50).<br />
Some blogs recommend changing it to DATETIME, but the range of DATETIME (1753-01-01 through 9999-12-31) is much smaller than DATE (0001-01-01 through 9999-12-31), so earlier dates cannot be represented.<br />
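In practice the change looks something like this (a sketch with made-up names; the conversion happens inside the procedure, where the full DATE range is preserved):

```sql
CREATE PROCEDURE dbo.usp_MyReportProc
    @AsOfDate varchar(50)   -- passed as text from the SSIS OLE DB Command
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @d date = CAST(@AsOfDate AS date);
    SELECT @d AS AsOfDate;  -- use @d wherever the DATE value is needed
END
```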
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-34258509301927951132012-10-03T11:42:00.002-04:002013-02-04T13:33:50.474-05:00SQL Server Task StatusHave you ever been performing a DBCC SHRINKFILE and wondered how long it was going to take? The Backup/Restore screen shows percent complete, how about DBCC CHECKDB?<br />
<br />
Here's a neat query that can help with that:<br />
<blockquote class="tr_bq">
Select * from sys.dm_exec_requests</blockquote>
This query returns information about every request that the SQL Server is executing. <br />
<br />
One of the columns returned is Percent_Complete. However, this column is only populated for the following commands:<br />
<blockquote class="tr_bq">
ALTER INDEX REORGANIZE<br />
AUTO_SHRINK option with ALTER DATABASE<br />
BACKUP DATABASE<br />
DBCC CHECKDB<br />
DBCC CHECKFILEGROUP<br />
DBCC CHECKTABLE<br />
DBCC INDEXDEFRAG<br />
DBCC SHRINKDATABASE<br />
DBCC SHRINKFILE<br />
RECOVERY<br />
RESTORE DATABASE<br />
ROLLBACK<br />
TDE ENCRYPTION</blockquote>
I have four large databases that I have to shrink when they are copied to the development environment. These can take quite a while, so I use this query to monitor their status.<br />
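A slightly narrowed version of that query shows just the in-flight operations (estimated_completion_time is reported in milliseconds):

```sql
SELECT session_id,
       command,
       percent_complete,
       estimated_completion_time / 60000.0 AS est_minutes_remaining
FROM   sys.dm_exec_requests
WHERE  percent_complete > 0;
```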
<br />
Check out the <a href="http://msdn.microsoft.com/en-us/library/ms177648%28v=sql.105%29.aspx" target="_blank">documentation</a> for more details!<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-3427810540235657952012-09-20T16:24:00.000-04:002013-02-04T13:33:08.148-05:00Performance Tuning and CacheWhen tuning queries and indexes for performance, it's a good idea to clear the cache when running benchmarks.<br />
For example, when you first run a query, a plan is created, and then data is fetched from the drive.<br />
On the next execute, your query plan won't need to be created, and the results will probably come from the cache, giving a false indication of how long the query will run.<br />
<br />
So to clear the cache, execute these two statements:<br />
<br />
<blockquote class="tr_bq">
<div class="sql" name="code">
<span style="font-size: x-small;"><a href="http://msdn.microsoft.com/en-us/library/ms187762%28v=sql.105%29.aspx" target="_blank">DBCC DROPCLEANBUFFERS</a></span></div>
<div class="sql" name="code">
<span style="font-size: x-small;"><span id="goog_446892721"></span><a href="http://msdn.microsoft.com/en-us/library/ms174283%28v=sql.105%29.aspx" target="_blank">DBCC FREEPROCCACHE</a><span id="goog_446892722"></span></span></div>
</blockquote>
<div class="sql" name="code">
<br /></div>
<div class="sql" name="code">
Then you can test with each execution being treated equally.</div>
<div class="sql" name="code">
<br /></div>
<div class="sql" name="code">
Take care on the FreeProcCache, as this is a server level command. Do not run this on a production box unless you really know what you are doing!</div>
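One more detail: Books Online recommends issuing a CHECKPOINT first, so dirty pages are flushed and DROPCLEANBUFFERS can empty the whole buffer pool. The full sequence for a test box looks like:

```sql
CHECKPOINT;               -- flush dirty pages to disk first
DBCC DROPCLEANBUFFERS;    -- empty the buffer cache
DBCC FREEPROCCACHE;       -- clear all cached plans (server-wide!)
```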
<div class="sql" name="code">
<br /></div>
<div class="sql" name="code">
<br />
peace</div>
Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-49474136920923045782012-08-28T13:56:00.001-04:002013-02-04T13:31:50.073-05:00SQL Server 2008 R2 and HyperthreadingSo I have read about the debate about hyperthreading on a SQL Server or not.<br />
<br />
Well, I decided to put it to the test.<br />
<br />
I have an ETL process that runs about 8 hours, moving data from a DB2 source, to a staging area on SQL Server.<br />
Then it is moved into an operational data store, and then finally into a data mart.<br />
<br />
The system has four 6-core Xeon processors.<br />
<br />
I turned off hyperthreading, and the ETL run time decreased by more than an hour!<br />
<br />
So there it is.. My totally unscientific, unique results!<br />
<br />
Your mileage may vary, but at least give it a try on a Xeon processor.<br />
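If you want to confirm what SQL Server is seeing before and after the BIOS change, this query helps (cpu_count is the logical CPU count; hyperthread_ratio is the logical-to-physical ratio per processor package):

```sql
SELECT cpu_count, hyperthread_ratio
FROM   sys.dm_os_sys_info;
```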
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com1tag:blogger.com,1999:blog-8413638223026049990.post-18583466189384990352012-08-08T12:13:00.000-04:002012-08-08T12:13:39.723-04:00SSRS Reporting in Java<br />
<div class="MsoNormal">
So we are in the final testing stages of our SSRS reporting
interface in a Java web front-end.</div>
<div class="MsoNormal">
The results are really fantastic! However, going thru this
process has taught me a lot about how SSRS works internally, things that I took
for granted when using report manager to render reports. These things will
probably help my report design in the future, so I thought I would write them
down and pay it forward!</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
First, a bit about how we implemented this:</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
In our reports, we force exact pagination by having a caption at the top of the report that reads: This report contains xxx pages. Since the report wants this number, SSRS will go ahead and process the report to calculate the page count. If we had left this out, SSRS would render the first page and not worry about how many subsequent pages were coming.</div>
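A caption like that can be built with the built-in total-pages global; a sketch (note that Globals!TotalPages is only available in the page header or footer):

```
="This report contains " & Globals!TotalPages & " pages."
```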
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
In the front end, we render page 1 of the report, get the
total page count, and then display the first page in a viewer similar to report
manager. Note that we are NOT using the report viewer control since this is a
Java application. We render all pages to HTML4.0 and display as is on the web
page. Since SSRS embeds all of the formatting, the front end is responsible for
none of that.</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
When the user clicks to go to the next page, we again call
render with the page number in the device info XML.</div>
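With the HTML rendering extension, the page number goes into the Section element of the device info XML passed to Render. A minimal sketch requesting page 2:

```xml
<DeviceInfo>
  <Section>2</Section>
</DeviceInfo>
```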
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
In parallel, I created a small C# app that calls a report and renders it much the same way. I did this so I could make sure I fully understood the process and could assist the Java developers.</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
We ran into issues when our reports took much longer to render in our front end than they did in report manager or my C# app. So I started watching stats using the ExecutionLog3 view in the report database. I paid special attention to TimeDataRetrieval, TimeProcessing, and TimeRendering. </div>
<div class="MsoNormal">
From my app I was seeing the data retrieval occurring on the first render, then some time processing and some time rendering. These numbers were all pretty much in line with what I would have expected. For a 110 page report, I had 3000-3500ms for data, 750-1000ms for processing, and 1000-1500ms for rendering. On this first page, the Source field was Live.</div>
<div class="MsoNormal">
On moving to the next page, though, the Source field was Session, and I was seeing no data retrieval at all, with minimal processing and rendering: 0ms for data, 35-40ms for processing, and 100ms for rendering. Very acceptable numbers for a reporting platform! So this tells me that SSRS processes the dataset all at once, does a bit of pre-rendering to get the page count, and fully renders the first page.</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
From our front end, however, I saw something completely different! Every request had a Source of Live, and each request was pulling data and rendering, sometimes taking up to 15-20 seconds to run the report.</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
I put WireShark on the SSRS server and started watching network packets. I was able to watch every command that came into the SSRS server, and the response from the server. It quickly became apparent what was happening. The front end was not sending the page number on the first Render call, and was then calling SetExecutionParameters between each page. The first Render call was causing the server to fully render the report for the first page. The SetExecutionParameters call was invalidating the session and causing the report to be fully re-rendered, including data retrieval, for each page. Obviously this was the cause behind the huge difference in execution times!</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
So I worked with the Java developer and was able to remove
any extraneous SSRS calls, and now the performance is amazing!</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
Our report template is pretty complex, with a lot of background features that require processing, so having execution times in the low 100ms range is great!</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
Speaking of our report template, we have a parameter called Date Range. A user can select a value in this drop-down to pick ranges such as last month, last year, etc. When the user selects one of these, we call SQL Server to return a start and end date based on the selection. In report manager, once you select one date range and then try to select a second, the start and end dates will not update. This is a “feature” according to Microsoft. Since we wrote our own front end, we were able to work around this: you can select Last 2 Years and the start and end dates will populate, and then change to Rolling 13 Months and the start and end dates will update accordingly!</div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
That’s all for now… I welcome any and all questions… </div>
<div class="MsoNormal">
<br /></div>
<div class="MsoNormal">
peace</div>Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-24373811995975277342012-07-16T14:33:00.001-04:002012-07-16T14:35:58.179-04:00SSRS Create Subscription Issue Using Web ServicesOur implementation of SSRS into our Java web site is going
very well! However, we ran into an issue trying to set up subscriptions thru
the web services.<br />
<br />
When trying to set the message body text (the Comment
field), we got the error:<br />
<blockquote class="tr_bq">
One of
the extension parameters is not valid for the following reason: you do not have
permission to modify the value for the "Comment" </blockquote>
The users all have the browser role, which allows Manage
Individual Subscriptions, but this is not enough. There is a restriction
in that setting that disallows the user from setting the comment field.<br />
What I had to do was give the browser role the ability to
Manage All Subscriptions. The major caveat with this setting is that if the
application calls ListSubscriptions(null, null), the user will see all
subscriptions in the system. We had to set up code on the front end to ALWAYS pass the UserID to the ListSubscriptions method. (<a href="http://msdn.microsoft.com/en-us/library/reportservice2005.reportingservice2005.listsubscriptions">http://msdn.microsoft.com/en-us/library/reportservice2005.reportingservice2005.listsubscriptions</a>)<br />
<br />
Also, note that the My Reports folder does not use the Browser security role; it uses the My Reports role. You will need to give the My Reports role the Manage All Subscriptions task as well.<br />
<br />
peace<br />
BobPBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-71600061964862838522012-06-26T11:20:00.001-04:002012-06-26T14:09:11.759-04:00SSRS Subscriptions and Blank LabelsRan into an interesting issue with SSRS subscriptions. Actually, I am very surprised I have not run into this yet.<br />
<br />
I am running SQL Server 2008r2 SP1.<br />
<br />
If you have a report with a multi select parameter that allows blanks, a blank label will either prevent you from setting up the subscription, or cause the subscription to fail when it is run.<br />
<br />
The Javascript error is:<br />
<blockquote class="tr_bq">
Message: 'firstChild' is null or not an object<br />
Line: 640<br />
Char: 13<br />
Code: 0<br />
URI: http://Server/InternalReporting/js/ReportingServices.js </blockquote>
and the subscription error is:<br />
<blockquote class="tr_bq">
Failure sending mail: Default value or value provided for the
report parameter 'ReportParameter1' is not a valid value. Mail will not
be resent.
</blockquote>
<br />
I have opened an issue on Connect: <a href="http://connect.microsoft.com/SQLServer/feedback/details/750927/blank-parameter-labels-cause-error-in-subscriptions">http://connect.microsoft.com/SQLServer/feedback/details/750927/blank-parameter-labels-cause-error-in-subscriptions</a><br />
<br />
What I have done in the meantime is to replace any blank labels with a placeholder character ("-").<br />
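In the parameter's label query, the substitution can be done inline (the names here are hypothetical):

```sql
SELECT ValueField,
       CASE WHEN LTRIM(RTRIM(LabelField)) = '' THEN '-'
            ELSE LabelField
       END AS LabelField
FROM   dbo.ParameterSource;
```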
<br />
Has anyone else run into this issue?<br />
<br />
BobPBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-73493508052239376632012-05-07T15:18:00.000-04:002012-06-26T11:21:46.442-04:00SSRS Parameters are Forced UnicodeWhile doing some performance tuning on a report query today, I ran into the issue of SSRS sending all strings to a SQL Server database as unicode. Here is a snippet of the query I was working with: <br />
<blockquote>
or tbl.Field in (N''0100'',N''1002'',N''1201'',N''1202''....</blockquote>
This snippet came right out of profiler with no editing... <br />
<br />
The N in front of each string forces SQL Server to treat it as unicode, as an NVARCHAR string. The issue with this is that my tbl.Field column is a varchar, so no index on it would ever be used for this query; it was always a clustered index scan. <br />
<br />
My workaround was to add an nvarchar column, using a trigger to populate it with the nvarchar version of the varchar field. Messy, but once I did that and indexed it, the query performed fine. <br />
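As an alternative to the trigger, a persisted computed column can be indexed directly and stays in sync automatically. A sketch (assumes the column is a plain varchar; adjust the nvarchar length to match):

```sql
ALTER TABLE tbl ADD FieldN AS CAST(Field AS nvarchar(4)) PERSISTED;
CREATE NONCLUSTERED INDEX IX_tbl_FieldN ON tbl (FieldN);
```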
<br />
I have opened up a Connect suggestion, so please visit that and vote for it. This situation should not exist.<br />
<br />
<a href="https://connect.microsoft.com/SQLServer/feedback/details/740975/ssrs-sends-all-strings-to-sql-server-as-unicode">https://connect.microsoft.com/SQLServer/feedback/details/740975/ssrs-sends-all-strings-to-sql-server-as-unicode</a> <br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-91090320186688486462012-04-16T16:25:00.000-04:002012-08-08T14:55:30.563-04:00SSRS Reporting in a Java Web SiteFor the last 6 months or so, I have been working on a project with our web application development team to integrate SSRS reporting into our client facing Java web site. <br />
<br />
When I first proposed this project, I knew that in theory this should work fine. But, it did take a bit of research and a lot of work to bring it to fruition.<br />
<br />
We are moving into the testing phase this week, and I am happy to say that it works very well! The Java developer working on this is doing a great job of translating all of my Microsoft talk to java :)<br />
<br />
We are using the Web Services provided by SSRS with a custom security extension that authenticates against our web site security.<br />
<br />
Since we are only rendering one page at a time, rendering is pretty quick, taking less than a second per page.<br />
<br />
Once we go thru testing and release, I am going to write more on this, including some details.Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com8tag:blogger.com,1999:blog-8413638223026049990.post-27133756552440216632012-04-11T14:28:00.000-04:002012-04-11T14:28:01.355-04:00Retrieve Table List in DB2/400 via SQL StatementAgain, mainly to remind myself, the query to retrieve a list of tables in DB2/400 is:<br />
<br />
<blockquote>select * from qsys2.systables where table_type='T';</blockquote><br />
If you want a list of views, table_type would = 'V'<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-68652918982169914772012-03-15T16:27:00.000-04:002012-03-15T16:27:22.494-04:00Duration Change in SQL Profiler 2005<blockquote>Beginning with SQL Server 2005, the server reports the duration of an event in microseconds (one millionth, or 10-6, of a second) and the amount of CPU time used by the event in milliseconds (one thousandth, or 10-3, of a second). In SQL Server 2000, the server reported both duration and CPU time in milliseconds. In SQL Server 2005 and later, the SQL Server Profiler graphical user interface displays the Duration column in milliseconds by default, but when a trace is saved to either a file or a database table, the Duration column value is written in microseconds. - Microsoft Books Online<br />
</blockquote><br />
This is just a helpful tip so you don't waste time following a non-issue... :-)<br />
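In practice this means dividing a saved trace's Duration values by 1,000 before comparing them with what the Profiler GUI showed you. A tiny Python sketch of that conversion (the duration values here are made up, just to show the arithmetic):

```python
# Hypothetical Duration values as saved to a SQL Server 2005+ trace file (microseconds)
saved_durations_us = [1500, 250000, 3000000]

# Convert to the milliseconds the Profiler GUI displays by default
displayed_ms = [us / 1000 for us in saved_durations_us]
print(displayed_ms)  # [1.5, 250.0, 3000.0]
```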
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com2tag:blogger.com,1999:blog-8413638223026049990.post-76473794757054918702011-12-02T11:14:00.000-05:002011-12-02T11:14:25.726-05:00Multi Row Inserts in T-SQLA new feature in SQL Server 2008 is the ability to insert multiple rows with one insert statement!<br />
<br />
In previous versions, anytime you wanted to insert multiple rows into a table with a T-SQL script, you would have done something like this:<br />
<br />
<blockquote>Insert into MyTable (Field1, Field2)<br />
Values ('ABC',123)<br />
Insert into MyTable (Field1, Field2)<br />
Values ('DEF',456)<br />
</blockquote>However, with SQL Server 2008, you can now do this:<br />
<br />
<blockquote>Insert into MyTable (Field1, Field2)<br />
Values ('ABC',123),('DEF',456)</blockquote><br />
Working with a growing data mart, I do a lot of scripted inserts, so this definitely makes my life a little simpler!<br />
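As an aside, this multi-row VALUES syntax isn't unique to SQL Server 2008; SQLite (3.7.11 and later) accepts the same shape, which makes it easy to try out from Python without a SQL Server instance handy. A quick sketch using an in-memory database and made-up table/column names matching the example above:

```python
import sqlite3

# In-memory database, throwaway table mirroring the T-SQL example above
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (Field1 TEXT, Field2 INTEGER)")

# One INSERT statement, multiple row-value lists
conn.execute("INSERT INTO MyTable (Field1, Field2) VALUES ('ABC', 123), ('DEF', 456)")

rows = conn.execute("SELECT Field1, Field2 FROM MyTable ORDER BY Field2").fetchall()
print(rows)  # [('ABC', 123), ('DEF', 456)]
```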
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-72820735879923928812011-10-18T13:38:00.000-04:002011-10-18T13:38:36.871-04:00Unattended Visual Source Safe 2005 installationTo install Visual Source Safe 2005 silently, you must first create an install ini file using Setup.exe.<br />
The setup in the root of the CD will not do this; you have to navigate to the setup folder on the CD.<br />
If you are using a download from MSDN, there is only one setup.exe and it’s the one to use.<br />
The machine that you use to create this ini file will be the template for all installations, and it must not already have VSS installed.<br />
<br />
From a command line, execute the following: <blockquote>\\(servername)\(sharename)\(VSS Path)\setup.exe /CreateUnattend \\(servername)\(sharename)\(Path to INI)\filename.ini</blockquote><br />
I create my ini file in the same folder as the setup process.<br />
For example, my command would look like this:<br />
<blockquote>\\VSSServer\VSS\setup.exe /CreateUnattend \\VSSServer\VSS\FullInstall.ini</blockquote><br />
Again, the machine that you run this process on should not have VSS installed yet. If it does, the ini file will be configured to upgrade or add/remove features and will not install VSS on the target machine.<br />
<br />
To install on the target machine, run this command:<br />
<br />
<blockquote>\\(servername)\(sharename)\(VSS Path)\setup.exe /UnattendFile \\(servername)\(sharename)\(Path to INI)\filename.ini</blockquote>Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-14310066634049634282011-09-13T10:16:00.003-04:002012-04-16T15:57:38.646-04:00Repeat Table Header on Each Page in SSRS 2008R2I ran into an issue with the tablix in SSRS 2008R2 where I set the tablix property “RepeatColumnHeaders” to True. However, this did not cause the table header to repeat.<br />
<br />
After a little experimentation, and a little help from <a href="https://sqlserverbiblog.wordpress.com/" target="blank">Paul Turley</a>, I figured out how to consistently get the table header to repeat on every page.<br />
<br />
The “RepeatColumnHeaders” property on the tablix seems to have no effect on this behavior. Instead, the repeat setting has to be made on a static member of the row groups.<br />
<br />
To see the static members of the row groups, click the small drop-down arrow at the far right of the Row Groups/Column Groups header, then select Advanced Mode.<br />
<br />
With the static members of the row group shown, select the first one in the list. The top left cell of the tablix should be selected in the designer.<br />
<br />
Now open the property pane, find the “RepeatOnNewPage” property, and set it to True.<br />
<br />
Then find the “KeepWithGroup” property and make sure it’s set to After.<br />
<br />
Now your table header will repeat on every page.<br />
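For reference, these two settings end up on the static TablixMember in the report’s RDL. A simplified fragment (element names from the 2008 RDL schema, trimmed down to just these settings) looks roughly like this:

```xml
<TablixRowHierarchy>
  <TablixMembers>
    <!-- Static header row member: this is what Advanced Mode exposes -->
    <TablixMember>
      <KeepWithGroup>After</KeepWithGroup>
      <RepeatOnNewPage>true</RepeatOnNewPage>
    </TablixMember>
    <!-- Group and detail members follow -->
  </TablixMembers>
</TablixRowHierarchy>
```

If the designer is misbehaving, you can also set these directly in the .rdl file with a text editor.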
<br />
BTW, check out Paul’s book “Microsoft SQL Server Reporting Services Recipes: for Designing Expert Reports”. It has been a good resource for SSRS.<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com0tag:blogger.com,1999:blog-8413638223026049990.post-71972181482287104532010-09-23T15:29:00.000-04:002010-09-23T15:29:27.234-04:00SSIS 2008 / SQL Server 2008 Connectivity IssueWe recently upgraded to SSIS 2008 and SQL Server 2008 R2. This helped the performance of a large package that contained 27 lookups for a fact table. It used an OLE DB destination component, set to Table – Fast Load. It would process a million rows in less than 2 minutes.<br />
<br />
This worked well until I added a multicast and an OLE DB Command component to the data flow. What I wanted was to check the values passed to the stored procedure and add the existing row to history if there was a change. The OLE DB Command used the same connection manager as the main data insert, since the history and current tables were in the same database.<br />
<br />
When running it, I would get to about 100,000 rows, and then SSIS would begin to slow down until it finally lost network connectivity to the SQL Server. (We have a separate SSIS server.) This is the error that was written to the logs:<br />
<br />
<blockquote>SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred.<br />
Error code: 0x80004005.<br />
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" <br />
Hresult: 0x80004005 Description: "A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".<br />
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" <br />
Hresult: 0x80004005 Description: "Communication link failure".<br />
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" <br />
Hresult: 0x80004005 Description: "Named Pipes Provider: No process is on the other end of the pipe."</blockquote><br />
I began by troubleshooting the network, looking for errors on the switch and both servers. This yielded nothing; no errors were logged. So I turned to SSIS, figuring that it somehow was the culprit, since everything had worked until I added the multicast.<br />
<br />
I removed the multicast and the OLE DB Command components and tested again. The package processed a million rows in less than 2 minutes.<br />
<br />
I added the multicast back in, without the OLE DB Command, and tested. Again it processed a million rows in less than 2 minutes.<br />
<br />
I added the OLE DB Command component back in and tested. It failed at 49,000 records.<br />
<br />
I added another connection manager with the same properties and pointed the OLE DB Command to the new manager. When I tested, I was surprised to see it complete successfully!<br />
<br />
It seems that the single connection manager was not able to handle multiple destinations simultaneously. I have no way to confirm this, but since switching to multiple connection managers for the same database, I have not experienced the issue at all.Bob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com1tag:blogger.com,1999:blog-8413638223026049990.post-41116865275016328892010-08-04T12:43:00.000-04:002010-08-04T12:43:36.675-04:00SSRS Fix for FireFox/Chrome Report SizeIf you have ever used Chrome or FireFox with SSRS, you know the report window can show up very small. This is due to the way that SSRS implements the IFRAME. <br />
<br />
A quick fix for this is to add the following entry to the ReportingServices.css file.<br />
<blockquote>.DocMapAndReportFrame<br />
{<br />
min-height: 800px;<br />
min-width: 800px;<br />
}<br />
</blockquote>You can usually find the ReportingServices.css file at: C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportManager\Styles; however, your location may be different depending on your installation.<br />
<br />
peaceBob Pearsonhttp://www.blogger.com/profile/16528166535649336177noreply@blogger.com7