Friday, February 8, 2019

PowerBI Report Server Change - January 2019

Yes, it has been a while!  Been busy with life.

However, I just fixed an issue on the new release of PowerBI Report Server - January 2019.

We use training videos on the report server to show the users how to use the dashboards; however, after the January release, I was restricted from uploading the file due to the error:

"Uploading or saving files with .mp4 extension is not allowed."

After a bit of digging around, I found this entry in the ConfigurationInfo table in the ReportServer database: AllowedResourceExtensionsForUpload
I simply added *.mp4 to that list, and now I am allowed to upload the videos again!
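For reference, the change looks roughly like this (a sketch only; back up the database first, and be aware that editing the ReportServer database directly is not officially supported):

Update dbo.ConfigurationInfo
Set    Value = Value + ',*.mp4'
Where  Name = 'AllowedResourceExtensionsForUpload';

You may need to restart the Report Server service before the new extension is accepted.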

Hope that helps


Friday, May 12, 2017

SSIS Error: DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005

DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005

I was hit with this VERY generic error writing data to DB2/i in SSIS using MS OLEDB for DB2.
After much searching and troubleshooting, I discovered that this error is thrown whenever you try to insert a NULL into a non-nullable column.
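One way to guard against this in the data flow (a sketch; the column name and default value are made up for illustration) is a Derived Column transformation that replaces NULLs before the destination:

REPLACENULL(some_column, "UNKNOWN")

or, on SSIS versions older than 2012 that lack REPLACENULL:

ISNULL(some_column) ? "UNKNOWN" : some_column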

Hope this helps someone else save a few hours of work.


Thursday, January 8, 2015

SSRS Issue with Microsoft OLE DB for DB2 v3.0

I ran into a perplexing problem on one of my SSRS servers.  This server handled the reporting for internal users, and had a lot of reports that read from our source system on iSeries DB2 (V6R1), plus some reporting from our SQL Server data mart.

The issue was that at random times, SSRS would fully utilize one processor, and a lot of memory.

When looking at the active jobs, it was never the same report, so we tried to find the common thread among the reports that were killing the server. We found nothing that would stop this from happening. I adjusted the maximum memory setting in SSRS, trying to keep the server from using all memory and causing a restart of the SSRS service. This had no effect either.

Finally, one of my team members stumbled across the commonality in all of the offending reports: They all used DB2 as the data source.

Well, the first thing I thought of was the Microsoft OLE DB for DB2. We had v3.0 installed on this server. It had been working fine for over two years, but there are regular OS updates to both the iSeries and the SSRS server, any one of which could have started this issue.

So I upgraded this server to v4.0, tested the reports to make sure we didn't break anything, and turned the users loose on it.

After two days I can call this a success! The server rarely gets above 10% processor, and very little memory is being used. Reports are faster, and the users are happier!

Update:  The server is running MUCH better, but we have run into an issue with some of the queries causing DB2 errors because a parameter is defined as Text while the database field is numeric.  I am still looking for a global way of fixing this, hopefully via the connection string to the DB2.

Update #2: After running v4.0 for 4 months, the problems have definitely gone away.  The only two side effects of this upgrade are:
  • A Text type parameter in an SSRS report will throw an error if used on a numeric data type on the DB2. To rectify this, I just did a cast on the numeric field when comparing to the text parameter.
  • In v3, field names were limited to a size smaller than on DB2.  If longer field names were used in reports, v3 just truncated the field name at the limit. v4.0 did away with this limitation, so field names come through at their full size, causing field name mismatches. Example:
    • AReallyLongFieldNameForTesting on DB2 might show up in v3.0 as AReallyLongFieldNameFo. Once you upgrade to v4.0, the full field name comes in, causing a name mismatch in your reports.
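The cast fix from the first bullet looks roughly like this in the report's DB2 query (library, table, field, and parameter names here are made up for illustration):

Select  CustomerName
From    SomeLibrary.SomeTable
Where   Cast(CustomerNumber as char(10)) = ?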


Thursday, September 5, 2013

Limitation with YEAR() function in SSIS

I learned something new today while researching an ETL failure:
The YEAR() function in SSIS has a lower limit of 1-1-1753.

Even though the DT_DATE and DT_DBDATE data types support dates earlier than 1-1-1753, the YEAR() function does not, and will fail.

For validation in a derived column transformation, I was using this logic:
YEAR(data_field) < 1900 ? (DT_DBDATE)"1-1-1900" : data_field

And when the date 8-23-0203 came through, the package failed. The error was nondescript:

An error occurred while evaluating the function.

So, be aware...
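One workaround (a sketch I have not load-tested) is to compare the date value directly instead of calling YEAR(), so the function never sees a pre-1753 date:

data_field < (DT_DBDATE)"1900-01-01" ? (DT_DBDATE)"1900-01-01" : data_field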


Wednesday, March 20, 2013

SSIS and Max Packet Size

Working to improve my ETL performance, I changed my Max Packet Size on my database server and my ETL server to 32767. However, when I tried to run an SSIS package, I received the following error:
Communication Link Error.
Shared Memory Provider: No process is on the other end of the pipe.
I also received this error in SQL Server Management Studio when connected to the Integration Services service on that server and trying to browse through the package store to a package.
After a little research, I determined that SSIS cannot read the packages from the MSDB database if the Max Packet Size is set larger than 16384.
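For reference, I believe the server-side setting maps to the 'network packet size (B)' option, which can be checked and changed with sp_configure (a sketch; run on the server in question):

Exec sp_configure 'show advanced options', 1;
Reconfigure;

Exec sp_configure 'network packet size (B)';          -- show the current value
Exec sp_configure 'network packet size (B)', 16384;   -- lower it back down
Reconfigure;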

So at this point, my options are to lower the Max Packet Size on the ETL server, save my SSIS packages on the file system, or move the packages to a package store on a different server with a Max Packet Size of 16384 or lower.

I am going to test this out and see if it warrants moving the packages to either the file system or another server, or if the 16384 packet size will be sufficient for our ETL processing.


Wednesday, March 13, 2013

Rebuilding All Indexes on a Table

Sometimes you have a need to rebuild all indexes on a table.  This could be because of an ETL process, or other data loading process that renders the indexes out of date.
I recently experienced this issue during ETL processing.

I could have scripted out all of the index rebuilds for the table and added them to an Execute SQL task within SSIS. However, if any index were deleted or renamed, the Execute SQL task would fail, and the ETL would stop. Also, if we added new indexes, they would not get rebuilt until we added them to the Execute SQL task.

So I wrote a stored procedure to rebuild all of the indexes on a table, or just one index on a table.

I wanted to share that procedure with you, since I think this may be a common scenario.

-- =============================================
-- spwRebuildTableIndexes
-- Author:        Bob Pearson
-- Create date: 2013-03-14
-- Description:    Rebuilds specified index or all indexes on a table
-- Updates
-- Date            Who        What
-- --------        ----    ----------------------------
-- =============================================
ALTER PROCEDURE [dbo].[spwRebuildTableIndexes]
    @TableName varchar(100),
    @IndexName varchar(100) = null
AS
BEGIN
    Declare @SQL varchar(max) = '';

    -- If a single index was specified, rebuild just that one and exit
    if @IndexName is not null
    begin
        Select @SQL = 'Alter Index ' + @IndexName + ' on ' + @TableName + ' REBUILD;';
        Execute (@SQL);
        print @TableName + '.' + @IndexName + ' Rebuilt...';
        return;
    end

    -- Otherwise, rebuild every index on the table
    Declare cur cursor for
        select
        from    sys.indexes i
        inner join
                sys.objects o
        on      i.object_id = o.object_id
        where = @TableName
        and     o.type = 'U'
        and is not null    -- skip the heap entry, which has no name
    for read only;

    Open cur;

    Fetch Next from cur into @IndexName;

    While @@FETCH_STATUS = 0
    begin
        Select @SQL = 'Alter Index ' + @IndexName + ' on ' + @TableName + ' REBUILD;';
        Execute (@SQL);
        print @TableName + '.' + @IndexName + ' Rebuilt...';
        Fetch Next from cur into @IndexName;
    end

    Close cur;
    Deallocate cur;
END
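Usage is then as simple as (table and index names here are just placeholders):

-- Rebuild every index on a table
Exec dbo.spwRebuildTableIndexes 'MyTable';

-- Rebuild just one index
Exec dbo.spwRebuildTableIndexes 'MyTable', 'IX_MyTable_SomeColumn';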

Wednesday, February 27, 2013

Quick T-SQL Tip #2: Identity Columns Seed Value

This post really deals with high volume tables with an identity column, but the idea is something that everyone should at least be familiar with.

Quite often I see table creation scripts that start like this:

Create Table ATable (
    ID int not null identity(1,1)
)

And while that is perfectly legal and will work with no issues, if this table is going to be a high-volume table, you should consider using the entire Int range, not just half of it.


The Int data type spans -2,147,483,648 through 2,147,483,647.  If your seed is 1, you lose 2,147,483,649 values that could be used.

I have seen production situations where they have run out of identity values because the seed was 1.
So, next time you create an Identity, ask yourself: "Two billion, or four billion?"

and then seed from the bottom of the range:

Create Table ATable (
    ID int not null identity(-2147483648,1)
)
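And if you want to see how close an existing identity column is to running out, DBCC CHECKIDENT with NORESEED reports the current value without changing anything:

-- Reports the current identity value for the table
DBCC CHECKIDENT ('ATable', NORESEED);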