Search results matching tags '2008 R2' and '2008'http://sqlblog.com/search/SearchResults.aspx?o=DateDescending&tag=2008+R2,2008&orTags=0en-USCommunityServer 2.1 SP2 (Build: 61129.1)Trapping SQL Server Errors with Extended Eventshttp://sqlblog.com/blogs/davide_mauri/archive/2013/03/17/trapping-sql-server-errors-with-extended-events.aspxSun, 17 Mar 2013 16:45:06 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:48282manowar<p>One very useful usage of Extended Events is the ability to trap SQL Server errors without the need to have a server trace running (which, by the way, is deprecated), with the additional feature of being able to query the data as soon as it comes in. This means that a solution to monitor and trap errors as soon as they happen can easily be created, in order to help developers fix problems as soon as they are detected. This is especially helpful in very big applications, where the code base is quite old and no one really knows everything about the solution.</p> <p>To start an Extended Events session that traps SQL Server errors with severity greater than 10, just run the following script:</p> <p><font size="2" face="Courier New">CREATE EVENT SESSION [error_trap] ON SERVER <br />ADD EVENT sqlserver.error_reported <br />&#160;&#160;&#160; ( <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; ACTION&#160;&#160;&#160; (package0.collect_system_time,package0.last_error,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_id,sqlserver.database_name,sqlserver.nt_username,sqlserver.plan_handle,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.tsql_frame,sqlserver.tsql_stack,sqlserver.username) <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; WHERE&#160;&#160;&#160; ([severity]&gt;10) <br />&#160;&#160;&#160; ) <br />ADD TARGET package0.event_file <br />&#160;&#160;&#160; ( <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; SET filename=N'D:\Program 
Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\XEvents\error_trap.xel' <br />&#160;&#160;&#160; ) <br />WITH <br />&#160;&#160;&#160; (&#160;&#160;&#160;&#160;&#160;&#160;&#160; <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; STARTUP_STATE=OFF <br />&#160;&#160;&#160; ) <br />GO</font></p> <p><font size="2" face="Courier New">ALTER EVENT SESSION [error_trap] ON SERVER <br />STATE = START; <br />GO</font></p> <p>The problem with Extended Events is that they only talk XML, which is surely flexible and extensible, but not at all comfortable to query. That’s why I prefer to turn the data into something relational. Using the XML nodes() function and pivoting the data does the trick:</p> <p><font size="2" face="Courier New">IF (OBJECT_ID('tempdb..#e') IS NOT NULL) DROP TABLE #e <br />go</font></p> <p><font size="2" face="Courier New">WITH cte AS <br />( <br />&#160;&#160;&#160; SELECT <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; CAST(event_data AS XML) AS event_data <br />&#160;&#160;&#160; FROM <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; sys.fn_xe_file_target_read_file('D:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\XEvents\error_trap*.xel', NULL, NULL, NULL) <br />), <br />cte2 AS <br />( <br />&#160;&#160;&#160; SELECT <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; event_number = ROW_NUMBER() OVER (ORDER BY T.x) <br />&#160;&#160;&#160; ,&#160;&#160;&#160; event_name = T.x.value('@name', 'varchar(100)') <br />&#160;&#160;&#160; ,&#160;&#160;&#160; event_timestamp = T.x.value('@timestamp', 'datetimeoffset') <br />&#160;&#160;&#160; ,&#160;&#160;&#160; event_data <br />&#160;&#160;&#160; FROM <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; cte&#160;&#160;&#160; <br />&#160;&#160;&#160; CROSS APPLY <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; event_data.nodes('/event') T(x) <br />) <br />SELECT * INTO #e FROM cte2 <br />go</font></p> <p><font size="2" face="Courier New">WITH cte3 AS <br />( <br />&#160;&#160;&#160; SELECT <br
/>&#160;&#160;&#160;&#160;&#160;&#160;&#160; c.event_number, <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; c.event_timestamp, <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; --data_field = T2.x.value('local-name(.)', 'varchar(100)'), <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; data_name = T2.x.value('@name', 'varchar(100)'), <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; data_value = T2.x.value('value[1]', 'varchar(max)'), <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; data_text = T2.x.value('text[1]', 'varchar(max)') <br />&#160;&#160;&#160; FROM <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; #e c <br />&#160;&#160;&#160; CROSS APPLY <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; c.event_data.nodes('event/*') T2(x) <br />), <br />cte4 AS <br />( <br />&#160;&#160;&#160; SELECT <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; * <br />&#160;&#160;&#160; FROM <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; cte3 <br />&#160;&#160;&#160; WHERE <br />&#160;&#160;&#160;&#160;&#160;&#160;&#160; data_name IN ('error_number', 'severity', 'message', 'database_name', 'database_id', 'client_hostname', 'client_app_name', 'collect_system_time', 'username') <br />) <br />SELECT <br />&#160;&#160;&#160; * <br />FROM <br />&#160;&#160;&#160; cte4 <br />PIVOT <br />&#160;&#160;&#160; (MAX(data_value) FOR data_name IN ([error_number], [severity], [message], database_name, database_id, username, client_hostname, client_app_name, collect_system_time)) T <br />WHERE <br />&#160;&#160;&#160; [severity] &gt; 10 <br />ORDER BY <br />&#160;&#160;&#160; event_timestamp DESC <br />go</font> <br /></p> <p>and voilà, now all errors can be easily identified:</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_7116168A.png"><img title="image" style="border-top:0px;border-right:0px;background-image:none;border-bottom:0px;padding-top:0px;padding-left:0px;border-left:0px;display:inline;padding-right:0px;" border="0" alt="image" 
src="http://sqlblog.com/blogs/davide_mauri/image_thumb_0C1EBFCC.png" width="1024" height="83" /></a></p> <p>It’s now very easy to create reports and dashboards to monitor the system in (near) real time. And this is vital when you run a 24x7 online business.</p>Forcing “Custom Properties” of a Data Flow Transformation to support DTS Expressionhttp://sqlblog.com/blogs/davide_mauri/archive/2013/01/30/forcing-custom-properties-of-a-data-flow-transformation-to-support-dts-expression.aspxWed, 30 Jan 2013 16:48:22 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:47396manowar<p>Today I was using a 3rd party Data Flow component that has several Custom Properties, for some of which I needed to change the values at runtime using a DTS Expression.</p> <p>To mimic the situation let’s use the “Percentage Sampling” transformation, which has two Custom Properties:</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_27FF2363.png"><img title="image" style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_4E6106AE.png" width="406" height="79" /></a></p> <p>Of the two Custom Properties, only some (one in this case) are also available outside the Data Flow, so that they can be targeted by a DTS Expression. 
Such properties are listed under the “Misc.” section of the Data Flow properties</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_66F0A3FE.png"><img title="image" style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_7F140E59.png" width="399" height="318" /></a></p> <p>and also in the Property Expression Editor window:</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_2C28FB28.png"><img title="image" style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_2B50953E.png" width="574" height="344" /></a></p> <p>Now, what if you need to make the “hidden” custom properties available for DTS Expression usage too? As you may have noticed, the <em>SamplingSeed</em> is not exposed outside the Data Flow. I tried searching the web, but after several minutes I gave up since I wasn’t able to find anything that could help. 
I then started to look into the SSIS object model and I found a nice property named <em>expressionType </em>in the <em>IDTSCustomProperty </em>interface that tells the engine whether the property value can be specified using a DTS Expression or not:</p> <p><a title="http://msdn.microsoft.com/en-us/library/bb510794.aspx" href="http://msdn.microsoft.com/en-us/library/bb510794.aspx">http://msdn.microsoft.com/en-us/library/bb510794.aspx</a></p> <p>If the value is set to <em>Notify</em> then the usage of DTS Expressions is possible; otherwise, if the property is set to <em>None</em>, as the name implies, DTS Expressions cannot be used.</p> <p>So all you need to do is open the .dtsx file and look for the component you want to touch and its properties</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_5C6FCFDE.png"><img title="image" style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_49BAD627.png" width="527" height="69" /></a></p> <p>and add (if it does not exist) or change the <em>expressionType</em> attribute to <em>Notify</em>:</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_221459FD.png"><img title="image" style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_7A6DDDD2.png" width="520" height="82" /></a></p> <p>Save the file and voilà, the property is now available for DTS Expression usage!</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_39CB9163.png"><img title="image" 
style="border-left-width:0px;border-right-width:0px;background-image:none;border-bottom-width:0px;padding-top:0px;padding-left:0px;display:inline;padding-right:0px;border-top-width:0px;" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_401267F1.png" width="544" height="160" /></a></p> <p>Now,<strong> just be aware that this is a sort of a hack, so double check that it works for you.</strong> On the 3rd party components we’re using it works like a charm, and it just saved the day, since without the ability to change some properties at run time, the deployment of our package in production could have been a nightmare.</p> <p>I tested this approach on both SQL Server 2012 and SQL Server 2008 and in both cases I didn’t have any problems.</p> <p>Hope this helps someone <img class="wlEmoticon wlEmoticon-smile" style="border-top-style:none;border-left-style:none;border-bottom-style:none;border-right-style:none;" alt="Smile" src="http://sqlblog.com/blogs/davide_mauri/wlEmoticon-smile_186BEBC7.png" />, enjoy!</p>Power Pivot SQL Agent Job Monitorhttp://sqlblog.com/blogs/davide_mauri/archive/2013/01/07/power-pivot-sql-agent-job-monitor.aspxMon, 07 Jan 2013 21:23:29 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:47058manowar<p>In the last few days I needed a quick and efficient way to monitor the performance of a job and of each of its steps over time. There’s a lot of interesting data in the MSDB database and the various <em>sysjob</em> tables, but the stored data is not really easy to use outside of Management Studio. 
Dates and times are not stored using a date/time data type, and there is no easy way to correlate the performance of a step with the related job execution (for example, it’s not that easy to return all the steps of a job executed, say, yesterday at 10:00).</p> <p>So I decided to write some views to make the querying easier, and then the next move was to create a Tabular Model with Power Pivot on Excel to make analysis easier too.</p> <p>Since this is a somewhat recurring task that I find myself doing during my job – not often, but now when I have to do it I have something much more powerful in my hands – I decided to also put it on CodePlex so that this little work can be shared with the community.</p> <p>You can find the scripts with the objects (3 UDFs and 4 Views) that must be created inside MSDB and the Power Pivot Excel 2010 model here:</p> <p><a title="http://jobmonitor.codeplex.com/" href="http://jobmonitor.codeplex.com/">http://jobmonitor.codeplex.com/</a></p> <p>Enjoy!</p>SSAS DMVs: useful linkshttp://sqlblog.com/blogs/davide_mauri/archive/2012/05/18/ssas-dmvs-useful-links.aspxFri, 18 May 2012 14:47:42 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:43450manowar<p>From time to time it happens that I need to extract metadata information from Analysis Services DMVs in order to quickly get an overview of the entire situation and/or drill down to the detail level. 
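These DMVs can be queried with a SQL-like syntax directly from an MDX query window in Management Studio, once connected to the Analysis Services instance. A couple of standard examples:

```sql
-- List all the cubes deployed in the current Analysis Services database
SELECT * FROM $SYSTEM.MDSCHEMA_CUBES;

-- List all the dimensions, with their metadata
SELECT * FROM $SYSTEM.MDSCHEMA_DIMENSIONS;

-- Show how much memory each SSAS object is currently using (server-wide)
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE;
```
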
As a memo, here are the links I use most when I need documentation on the SSAS DMVs:</p> <p>SSAS: Using DMV Queries to get Cube Metadata <br /><a title="http://bennyaustin.wordpress.com/2011/03/01/ssas-dmv-queries-cube-metadata/" href="http://bennyaustin.wordpress.com/2011/03/01/ssas-dmv-queries-cube-metadata/">http://bennyaustin.wordpress.com/2011/03/01/ssas-dmv-queries-cube-metadata/</a></p> <p>SSAS DMV (Dynamic Management View) <br /><a title="http://dwbi1.wordpress.com/2010/01/01/ssas-dmv-dynamic-management-view/" href="http://dwbi1.wordpress.com/2010/01/01/ssas-dmv-dynamic-management-view/">http://dwbi1.wordpress.com/2010/01/01/ssas-dmv-dynamic-management-view/</a></p> <p>Use Dynamic Management Views (DMVs) to Monitor Analysis Services <br /><a title="http://msdn.microsoft.com/en-us/library/hh230820.aspx" href="http://msdn.microsoft.com/en-us/library/hh230820.aspx">http://msdn.microsoft.com/en-us/library/hh230820.aspx</a></p>DTLoggedExec 1.1.2008.4 SP2 released!http://sqlblog.com/blogs/davide_mauri/archive/2012/02/01/dtloggedexec-1-1-2008-4-sp2-released.aspxWed, 01 Feb 2012 17:15:44 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:41493manowar<p>A couple of hours ago I released SP2 of my <a title="DTLoggedExec" href="http://dtloggedexec.davidemauri.it" target="_blank">DTLoggedExec</a> tool.</p> <p>For those who don’t know it, it’s a DTEXEC replacement, useful for executing SSIS packages with logging provided right from the engine and not from the package itself.</p> <p>More info can be found here:</p> <p><a title="http://dtloggedexec.codeplex.com/" href="http://dtloggedexec.codeplex.com/">http://dtloggedexec.codeplex.com/</a></p> <p>This SP2 release adds an important feature to the CSV Log Provider. 
It's now possible to store a personalized label in each log, in order to make it easy to identify or group logs.</p> <p>Let's say, for example, that you have 10 packages in your ETL solution, and each time you have to load your data, you need to execute all those 10 packages. In other words, you have a batch made of 10 packages. It would be nice if all the logs - one for each package - could be identified as a whole and grouped together, in order to quickly identify all the logs of a particular batch. </p> <p>This will make things easier when you want to know the overall time consumed by each batch execution.</p> <p>The new &quot;ExecutionLabel&quot; attribute will help to achieve this. A useful ExecutionLabel can be obtained using SQL Server Agent tokens. For example:</p> <p>LABEL=$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))</p> <p><b>ChangeLog</b></p> <ul> <li>Updated the CSV Log Provider in order to write the status of the log file in the header. <ul> <li>An OPEN status means that the log file is being written. </li> <li>A CLOSED status means that the log file has been written correctly. </li> <li>A file can be loaded into the database log only if it is in the CLOSED state.</li> </ul> </li> <li>CSV Log files have 2 additional rows in the header: one for the FileStatus and one for the ExecutionLabel values. <ul> <li>The file format has been updated to 4 from 3. (Only the header section of the file has been changed.)</li> </ul> </li> <li>The CSV Log Provider will now display the EndTime value in the Console </li> <li>Updated import-log-data.sql to correctly load files with format 3 (the old one) and 4 (the new one). 
</li> <li>Updated the database schema to version 19 in order to store the new ExecutionLabel value</li> <li>Updated the samples in order to show how to use the new ExecutionLabel option</li> </ul> <p>As usual the download is available for free here:</p> <p><a title="http://dtloggedexec.codeplex.com/releases/81526/download/336490" href="http://dtloggedexec.codeplex.com/releases/81526/download/336490">http://dtloggedexec.codeplex.com/releases/81526/download/336490</a></p>SYS2 scripts updated (December 2012)http://sqlblog.com/blogs/davide_mauri/archive/2011/12/12/sys2-scripts-updated-december-2012.aspxMon, 12 Dec 2011 22:26:43 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:40288manowar<p>I’ve updated my SYS2 scripts:</p> <ul> <li>Added a new script to see how much buffer cache memory is used by each database </li> <li>Updated the sys2.stats script in order to have only one row per statistic </li> <li>Updated the sys2.query_stats script to use the sys.dm_exec_plan_attributes DMV to get better information on which database was used by the cached plans </li> </ul> <p>As usual they are available from CodePlex:</p> <p><a title="http://sys2dmvs.codeplex.com/" href="http://sys2dmvs.codeplex.com/">http://sys2dmvs.codeplex.com/</a></p> <p>Enjoy!</p>Spotlight session at PASS 2011 - Temporal Snapshot Fact Tablehttp://sqlblog.com/blogs/davide_mauri/archive/2011/05/29/spotlight-session-at-pass-2011-temporal-snapshot-fact-table.aspxSun, 29 May 2011 13:44:59 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:35953manowar<p>I’m very happy to announce that my proposal for the Spotlight session I’ve been invited to deliver at PASS 2011 has been accepted!</p> <blockquote> <p><strong>Temporal Snapshot Fact Table</strong></p> <p>You are designing a BI Solution and your customer asks you to keep a snapshot of the status of all their documents (orders, insurances, contracts, bills...whatever the word &quot;document&quot; may mean) for all the days of the year. 
They have millions of documents and they want to have in their Data Warehouse all the data they have gathered right from the very first operating day.</p> <p>If you have 1 million documents (on average), you have to keep a snapshot of them for each of the 365 days in a year, and you have 10 years of history, you're going to have a 3 billion row table just to start with. That's a very big and challenging number, and you may not have the option to buy a Parallel Data Warehouse. </p> <p>In this session, we'll see how we can turn the usual snapshot tables into temporal tables so that we can store time intervals in order to avoid data duplication, while keeping the Data Warehouse design usable by Analysis Services (which doesn't know what an interval is) and optimizing it to have very good performance even on standard hardware.</p> <p>The technique explained is the result of several months of research and has been applied to the Data Warehouse of an insurance company where we had to deal with twice the number mentioned above. </p> </blockquote> <p>The topic is very interesting and – I bet – very intriguing for many people working in BI and especially with Analysis Services, since SSAS lacks support for “time intervals” to define the validity period of a fact row. With my Italian SolidQ colleagues I’ve been able to find a way to overcome this limitation, allowing the storage of daily snapshots of data with very high efficiency and performance.</p> <p>In this session I’m going to share everything we discovered with you. It will be really interesting, I can tell you! 
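To give a rough idea of the direction the session takes (this is only a simplified sketch with hypothetical table and column names, not the actual design presented in the session), the core move is to replace the one-row-per-document-per-day snapshot with one row per interval of days in which nothing changed:

```sql
-- One row per document per *interval* of unchanged days, instead of
-- one row per document per day (all names here are illustrative only)
CREATE TABLE dbo.factDocumentSnapshot
(
    document_id  int   NOT NULL,
    status_id    int   NOT NULL,
    amount       money NOT NULL,
    date_from_id int   NOT NULL, -- first day (date key) the row is valid
    date_to_id   int   NOT NULL  -- last day (date key) the row is valid
);

-- The snapshot "as of" a given day is made of the rows whose interval
-- contains that day:
SELECT document_id, status_id, amount
FROM dbo.factDocumentSnapshot
WHERE 20110115 BETWEEN date_from_id AND date_to_id;
```

The hard part, and the subject of the session, is making such a design transparent to Analysis Services, which only understands point-in-time facts.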
Probably one of the most advanced – yet simple – usages of SSAS and the Many-To-Many relationship you’re going to see.</p>SYS2 Scripts Updated – Scripts to monitor database backup, database space usage and memory grants now availablehttp://sqlblog.com/blogs/davide_mauri/archive/2011/02/13/sys2-scripts-updated-scripts-to-monitor-database-backup-database-space-usage-and-memory-grants-now-available.aspxSun, 13 Feb 2011 17:30:22 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:33451manowar<p>I’ve just released three new scripts in my “sys2” script collection that can be found on CodePlex:</p> <p>Project Page: <a title="http://sys2dmvs.codeplex.com/" href="http://sys2dmvs.codeplex.com/">http://sys2dmvs.codeplex.com/</a></p> <p>Source Code Download: <a title="http://sys2dmvs.codeplex.com/SourceControl/changeset/view/57732" href="http://sys2dmvs.codeplex.com/SourceControl/changeset/view/57732">http://sys2dmvs.codeplex.com/SourceControl/changeset/view/57732</a></p> <p>The three new scripts are the following:</p> <ul> <li>sys2.database_backup_info.sql </li> <li>sys2.query_memory_grants.sql </li> <li>sys2.stp_get_databases_space_used_info.sql </li> </ul> <p>Here are some more details:</p> <blockquote> <p><strong>database_backup_info</strong></p> <p>This script has been made to quickly check if and when backups were done. It will report the last full, differential and log backup date and time for each database. 
Along with this information you’ll also get some additional metadata that shows whether a database is read-only, plus its recovery model:</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_4B56D20D.png"><img style="background-image:none;border-right-width:0px;padding-left:0px;padding-right:0px;display:inline;border-top-width:0px;border-bottom-width:0px;border-left-width:0px;padding-top:0px;" title="image" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_269594C9.png" width="1028" height="109" /></a></p> <p>By default it will check only the last seven days, but you can change this value by just specifying how many days back you want to check.</p> <p>To analyze the last seven days, and list only the databases using the FULL recovery model without a log backup:</p> <p><font face="Courier New">select * from sys2.databases_backup_info(default) <br />where recovery_model = 3 and log_backup = 0</font></p> <p>To analyze the last fifteen days, and list only the databases using the FULL recovery model with a differential backup:</p> <p><font face="Courier New">select * from sys2.databases_backup_info(15) <br />where recovery_model = 3 and diff_backup = 1</font></p> <p>I just love this script; I use it every time I need to check that backups are not too old and that t-log backups are correctly scheduled.</p> <p><strong>query_memory_grants</strong></p> <p>This is just a wrapper around sys.dm_exec_query_memory_grants that enriches the default result set with the text of the query for which memory has been granted or is waiting for a memory grant and, optionally, its execution plan.</p> <p><strong>stp_get_databases_space_used_info</strong></p> <p>This is a stored procedure that lists all the available databases and, for each one, the overall size, the used space within that size, the maximum size it may reach and the auto grow options. 
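As an indication of the kind of data such a procedure works with (this is a simplified sketch, not the actual sys2 code), the current and maximum size of every database file can be read from the sys.master_files catalog view:

```sql
-- Size, maximum size and growth settings for every file of every database.
-- size and max_size are expressed in 8KB pages; max_size = -1 means the
-- file can keep growing until the disk is full.
SELECT
    DB_NAME(database_id)              AS database_name,
    name                              AS logical_file_name,
    size * 8 / 1024                   AS size_mb,
    CASE max_size
        WHEN -1 THEN NULL             -- unlimited growth
        ELSE max_size * 8 / 1024
    END                               AS max_size_mb,
    growth,
    is_percent_growth
FROM sys.master_files
ORDER BY database_name;
```
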
This is another script I use every day in order to be able to monitor, track and forecast database space usage.</p> </blockquote> <p>As usual, feedback and suggestions are more than welcome!</p>BI Virtual Chapter: “Adaptive BI: Engineering a BI Solution”http://sqlblog.com/blogs/davide_mauri/archive/2010/09/09/bi-virtual-chapter-adaptive-bi-engineering-a-bi-solution.aspxThu, 09 Sep 2010 09:54:27 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:28643manowar<p>On September 20th I’ll deliver a session on a topic that is IMHO really important but not yet covered enough:</p> <blockquote> <p><b>Adaptive BI: Engineering a BI Solution</b><b></b></p> <p>Are you starting to create a BI solution...but where do you begin? How do you set everything up correctly so that you'll be able to handle new features required by the customers easily and without too much effort? Which standards do you put in place so that everyone who joins the project will be immediately operative? How do you define your BI architecture so that it can be sound and flexible at the same time?&#160; All these questions come up every time someone has to start a BI project. In this session I'll present a set of standard rules we use to create our BI projects, consolidated over more than five years of development (since SQL 2005 was released), that put in place some technical rules helping you to start the correct way right from the beginning: naming conventions, architecture decisions, database unit testing, layering and everything you need to know to build an &quot;Adaptive&quot; Business Intelligence solution. The idea is to be flexible in terms of architectural decisions but to have some well-known rules in the solution to make it “engineerable”: in one word...&quot;Adaptive&quot;!</p> </blockquote> <p>If you’re into BI this session can give you some interesting ideas on how to structure your next project. 
See you in cyberspace here</p> <p><b><a href="https://www.livemeeting.com/cc/usergroups/join?id=C97PWB&amp;role=attend&amp;pw=Z%40%2B%5Ew%2B%40h4">https://www.livemeeting.com/cc/usergroups/join?id=C97PWB&amp;role=attend&amp;pw=Z%40%2B%5Ew%2B%40h4</a> </b></p> <p>on Sept 20th at 12pm EST.</p> <p>PS</p> <p>At the end of the session I’ll also show how the rules can be applied automatically with the right tools, creating an entire package to load a dimension in less than 1 minute, with all best practices and standards in place!</p> <p><a href="http://sqlblog.com/blogs/davide_mauri/image_2EF590FC.png"><img style="border-bottom:0px;border-left:0px;display:inline;border-top:0px;border-right:0px;" title="image" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_0B0CB9A2.png" width="226" height="244" /></a> <a href="http://sqlblog.com/blogs/davide_mauri/image_023CB156.png"><img style="border-bottom:0px;border-left:0px;display:inline;border-top:0px;border-right:0px;" title="image" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_21DB6551.png" width="244" height="223" /></a> <a href="http://sqlblog.com/blogs/davide_mauri/image_4A2A97A5.png"><img style="border-bottom:0px;border-left:0px;display:inline;border-top:0px;border-right:0px;" title="image" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_1E0D9AB4.png" width="244" height="180" /></a><a href="http://sqlblog.com/blogs/davide_mauri/image_55DFDED7.png"><img style="border-bottom:0px;border-left:0px;display:inline;border-top:0px;border-right:0px;" title="image" border="0" alt="image" src="http://sqlblog.com/blogs/davide_mauri/image_thumb_45F09A13.png" width="161" height="244" /></a></p>PASS Summit 2010 – BI Workshophttp://sqlblog.com/blogs/davide_mauri/archive/2010/06/30/pass-summit-2010-bi-workshop.aspxWed, 30 Jun 2010 12:01:10 GMT21093a07-8b3d-42db-8cbf-3350fcbf5496:26596manowar<p>PASS Summit 2010 Pre &amp; Post conference <a 
href="http://sqlpass.eventpoint.com/PrePostConferenceSessions">are out</a>! This year I’ll deliver the “<a href="http://sqlpass.eventpoint.com/topic/details/TF1588">Creating BI Solution from A to Z</a>” seminar in which, as the title implies, attendees will see how to create a BI solution starting from scratch. Going through dimensional modeling and the creation of the Data Warehouse, the implementation of the ETL process with SSIS, and the creation of cubes with Analysis Services and reports with Reporting Services, plus, if time permits, also a glance at PowerPivot, attendees will get a solid grounding in the whole process that drives the creation of a BI solution.</p> <p>The workshop will also show all the best practices and the best methodological approach matured in more than 5 years of working in the BI field.</p> <p>If you’ll be working on BI, or you have already created an initial BI solution and want to be sure you’re following the right path, this workshop is for you!</p> <p>Don’t miss it!</p>