Sensible Data Integrations
http://sensibledataintegrations.com

SQL Server 70-768: Developing SQL Data Models (Video Training)
http://sensibledataintegrations.com/2018/04/18/pearsononlinelearning/
Wed, 18 Apr 2018 21:20:49 +0000

SQL Server Analysis Services (SSAS) provides a robust semantic environment for modeling data in support of analytic and ad hoc reporting needs. With each release of SQL Server, SSAS functionality is enhanced, with tabular models in particular becoming more capable and fully featured.

In this video, Scot Reagin couples discussion with demonstration to give you not just an understanding of the functional differences between multidimensional and tabular models, but also the context to understand where and why each should be implemented.

In addition to a comprehensive discussion of design and implementation topics, Scot covers techniques for developing queries using Multidimensional Expressions (MDX) and Data Analysis Expressions (DAX), optimizing processing and query performance, and configuring and maintaining SSAS.

Lesson Descriptions

Lesson 1, Designing a Multidimensional Semantic Model, covers building a multidimensional semantic model—a Cube. You will create the fundamental components of the model, dimensions and measures, as a project and deploy it to Analysis Services.

Lesson 2, Designing a Tabular Semantic Model, demonstrates how to build and deploy tabular models while exploring security and refresh options.

Lesson 3, Developing Queries Using MDX and DAX, covers developing queries and expressions using MDX and DAX. Understanding these query languages is essential to optimizing Analysis Services models—whether multidimensional or tabular.

Lesson 4, Configuring and Maintaining SSAS, looks at ways to monitor and configure Analysis Services for best performance.

Click here to view video and purchase options.

Is Your Cube Corrupt?
http://sensibledataintegrations.com/2018/03/29/is-your-cube-corrupt/
Thu, 29 Mar 2018 01:30:50 +0000

The Database Console Commands (DBCC) are a collection of Transact-SQL statements that allow you to perform maintenance tasks, validate operations on a database or database component, retrieve SQL Server information, and carry out other miscellaneous tasks such as enabling a trace flag. These are critical capabilities when you’re responsible for ensuring your multidimensional environment is working properly. DBCC is sometimes expanded as Database Consistency Checks, the name it carried in some of the earliest versions of SQL Server.

This article will show you how to execute DBCC for Analysis Services using XMLA queries. The included examples were created on a Windows Server 2016 Azure virtual machine running SQL Server 2017. The WideWorldImporters and WideWorldImportersDW databases were used, along with the Multidimensional project for WideWorldImporters that Microsoft provides via GitHub. Microsoft does not currently provide a sample Tabular project for WideWorldImporters, so I created a simple Tabular database with its compatibility level set to 1400.

While DBCC validation commands have existed for relational databases since some of the first versions of SQL Server, it wasn’t until the release of SQL Server 2016 that a similar, though narrower, set of commands was made available for Analysis Services. These commands allow you to use either XMLA or MDX queries to execute DBCC in SQL Server Management Studio. DBCC for Analysis Services can validate both Multidimensional and Tabular databases at any compatibility level of SQL Server 2016 or higher. There are two command syntaxes: one for Multidimensional or Tabular databases at compatibility levels 1100 and 1103, and another for Tabular model databases at compatibility level 1200 or higher.

To execute the DBCC commands, you will need to be a member of either the Full Control (Administrator) database role for the target database or the Server Administrator role.

DBCC command for Multidimensional models

For Multidimensional models, the Database Consistency Checker (DBCC) for Analysis Services only checks a database’s partition indexes. The checks validate each partition index’s metadata, look for physical corruption, and verify segment statistics and indexes. DBCC does this by building temporary indexes and comparing them to the partition indexes saved to disk.

There are several parameters you can pass to the DBCC command. The first is the Database ID, which you can find by right-clicking the database in Management Studio and selecting Properties.


Once you have the Database ID, you can pass it as a parameter to execute a consistency check against the entire database.  The database is not corrupt if the results returned are empty.
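For example, a minimal XMLA sketch of that database-level check looks like the following; the DatabaseID value is a placeholder, so substitute the ID shown in your own database’s properties:

<DBCC xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- Placeholder: use the Database ID from the database properties dialog -->
    <DatabaseID>WideWorldImportersMultidimensional</DatabaseID>
  </Object>
</DBCC>

Run the command from an XMLA query window in Management Studio while connected to your Analysis Services instance.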

If you click on the Messages tab, you may see details about the partitions that were checked. Executing the command against a small database may return less information here.

Beyond the database ID, you can narrow the scope of the check by also passing a cube ID, then a measure group ID, and finally the partition ID of a measure group, as shown in the sketch below.
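A sketch of the fully scoped command, with hypothetical IDs standing in for your own cube, measure group, and partition, might look like this; omit any of the inner elements to widen the scope of the check:

<DBCC xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- All IDs below are placeholders for the object IDs in your own model -->
    <DatabaseID>WideWorldImportersMultidimensional</DatabaseID>
    <CubeID>Wide World Importers</CubeID>
    <MeasureGroupID>Sale</MeasureGroupID>
    <PartitionID>Sale 2016</PartitionID>
  </Object>
</DBCC>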

DBCC command for Tabular models with a compatibility level of 1200 or higher

The Database Consistency Checker (DBCC) for Analysis Services has a more comprehensive set of consistency checks for Tabular models than for Multidimensional models. It validates multiple types of Tabular objects for corruption, including databases, tables, partitions, relationships, hierarchies, columns, and more. The checks performed are the same validation steps that run when you restore, synchronize, or reload a Tabular database.

As with the Multidimensional model, the first parameter you might pass is the database ID.
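A minimal sketch of that check against the whole Tabular database follows; again, the DatabaseID value is a placeholder for the ID of your own database:

<DBCC xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- Placeholder: use the Database ID of your Tabular database -->
    <DatabaseID>WideWorldImportersTabular</DatabaseID>
  </Object>
</DBCC>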

You can narrow the check by also passing the name of a table in the database or, finally, the name of a partition within that table, as in the sketch below.
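Scoped to a single table or a single partition, the sketch becomes the following; the table and partition names are hypothetical, and note that the Tabular syntax identifies them by name rather than by ID:

<DBCC xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- Placeholders: substitute a table and partition from your own model -->
    <DatabaseID>WideWorldImportersTabular</DatabaseID>
    <TableName>Sale</TableName>
    <PartitionName>Sale</PartitionName>
  </Object>
</DBCC>

To check only the table, leave out the PartitionName element.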

Final thoughts

The Database Consistency Checker (DBCC) for Analysis Services allows you to easily check your multidimensional or tabular model database. If you do have corruption, the error messages returned will help you determine next steps. Microsoft provides a list of checks and errors along with some common error conditions and how to resolve them.

If you have other questions, we’re always available. Drop us a line.

SQL Saturday: Is Data Positioning Self-Defense or a Brilliant New Architecture?
http://sensibledataintegrations.com/2018/02/27/sql-saturday/
Tue, 27 Feb 2018 18:56:25 +0000

Join Sensible Data Integrations on Saturday, March 24th, 2018, in Colorado Springs for SQL Saturday. Scot Reagin will be speaking about Data Positioning from 9:40-10:40 in room 6.

“The Cloud, streaming data, big data, self-service, machine learning, AI and the internet of things…. clearly we’re not in Data Management Kansas anymore. Data Modelers and Architects have been reacting to wave after wave of demands from these communities with new technologies and methodologies but too often remain grounded in outdated concepts of data management principles. Data Positioning is an evolution of this thinking, allowing Data Managers to deliver value today and in the future.”

Location: Colorado Technical University, 4435 N Chestnut St, Colorado Springs, CO 80907

Learn more about the free event and register

 

Global Data Summit Golden, CO October 2-3, 2017
http://sensibledataintegrations.com/2017/08/07/global-data-summit/
Mon, 07 Aug 2017 14:36:55 +0000

Scot Reagin will speak at the upcoming Global Data Summit in Golden, Colorado on October 2nd and 3rd, 2017.

The conference brings together data industry experts to discuss innovation and the latest trends.  Among the topics the twenty-six expert speakers will address are:

  • Streaming data
  • Quantum computing for IT managers
  • Big data modeling
  • Agile warehouse architecture client cases
  • The near term future of ensemble modeling
  • Business intelligence markup language (BIML) client case

Join us to share in the discussion of data innovation with experts working with the biggest companies in the world.

Learn more and register

 

Join us for SQL Saturday Omaha July 22, 2017
http://sensibledataintegrations.com/2017/07/13/sql-saturday-omaha-july-22-2017/
Thu, 13 Jul 2017 17:02:06 +0000

Scot Reagin and Steve Nogradi will be speaking July 22 at SQL Saturday  at Mammel Hall, University of Nebraska-Omaha.

 

Agile Data Modeling with Data Vault  is the theme for the 8:30 a.m. session.

 

Agility and Business Intelligence are two good things that often struggle to be good together. A primary cause of this struggle is the inability of traditional Data Warehouse models to respond to change in a (business) timely manner. Data Vault is an evolution of Enterprise Data Warehouse modeling that removes the barriers to Data Warehouse agility. A Data Vault warehouse eliminates re-engineering of both the data schema and the ETL as the model evolves in response to changing business needs and definitions. In this session we’ll compare modeling techniques in real-world scenarios to understand how Data Vault can make your EDW more capable and agile.

HUB data integration

During the 3:45-5:00 session, Scot and Steve will share why the hero you need is Data Integration.

Every organization struggles to ensure data is up to date and consistent across multiple systems. Often this results in the development of layers of system-to-system data integrations or repetitive, manual processes intended to support operational and reporting needs.

Enterprise Data Warehouses and Marts are often the only stores that provide single-point access to data sourced from multiple systems – but these stores can be expensive and complicated to build and maintain, and they tend to lag behind business need.

Data Lakes offer the promise of fast, inexpensive data acquisition but, without specialized tools, lack the structure that makes their data accessible and useful to the business.

A managed Data Integration platform can provide both efficient data management across systems and a shared, single-point store of data sourced from all participating systems. A Data Integration platform solution is not technology or tool specific and can provide immediate and continuing value.

 

View schedule and Register

 

 

Data Vault Modeling Certification Courses in Sydney and Melbourne November, 2017
http://sensibledataintegrations.com/2017/07/10/data-vault-modeling-certification-courses-in-sydney-and-melbourne-november-2017/
Mon, 10 Jul 2017 10:16:05 +0000

Join us in Australia this November for a three-day Data Vault Modeling Certification Course. Scot Reagin will be an instructor in Sydney November 15-17 and in Melbourne November 20-22.

What you’ll learn:

This course will guide you through the Data Vault modeling approach, from modeling constructs and patterns to applying data vault principles in your DWBI program. The course also covers loading paradigms, architectures, and how to develop an effective overall data vault data warehouse program. Since data vault may be new to many of you, it also includes a summary of the benefits of using data vault techniques.

New in 2017:

Updates to the course for 2017 include Big Data modeling, revised UOW (unit of work), Metrics Overlay, Ensemble Modeling, updated BK (business key) design, Hash Key options, new Raw/BDV (Business Data Vault) materials, and new cases.

Format:

The hands-on course format pairs instructor guidance with in-class modeling exercises that students work throughout all three days. The variety of real-world scenarios worked in the course means all students are able to learn from a range of activities.

The CDVDM course is a three (3) day intensive, classroom-based course consisting of five core components plus the certification exam. The classroom time is split roughly 40% classroom lecture, 30% small group exercises, and 30% interactive discussions.

 

Who should take this course:

The target audience for this course includes data warehousing and business intelligence professionals, data modelers, data architects, model managers, data warehouse DBAs, and ETL professionals. Because data vault modeling concepts are closely aligned with the business aspects of DWBI programs, our target audience also includes program managers, business analysts, information modelers, information architects, BICC professionals, and data scientists.

 

Learn more and register:

Sydney

Melbourne

SQL Saturday Philadelphia June 3, 2017
http://sensibledataintegrations.com/2017/05/16/sql-saturday-philadelphia-june-3-2017/
Tue, 16 May 2017 20:40:10 +0000

Scot Reagin and Steve Nogradi will be speaking at SQL Saturday in Philadelphia June 3rd.

Agility and Business Intelligence are two good things that often struggle to be good together. A primary cause of this struggle is the inability of traditional Data Warehouse models to respond to change in a (business) timely manner. Data Vault is an evolution of Enterprise Data Warehouse modeling that removes the barriers to Data Warehouse agility. A Data Vault warehouse eliminates re-engineering of both the data schema and the ETL as the model evolves in response to changing business needs and definitions. In this session we’ll compare modeling techniques in real-world scenarios to understand how Data Vault can make your EDW more capable and agile.

Learn more

Data Integration: The Neglected Hero of Your Information Environment
http://sensibledataintegrations.com/2017/02/08/data-integration-the-neglected-hero-of-your-information-environment/
Wed, 08 Feb 2017 18:51:09 +0000

SDI Founder Scot Reagin will be speaking March 25th at SQL Saturday on the campus of Colorado Technical University in Colorado Springs.

Session Preview:

Every organization struggles to ensure data is up to date and consistent across multiple systems. Often this results in the development of layers of system-to-system data integrations or repetitive, manual processes intended to support operational and reporting needs.

Enterprise Data Warehouses and Marts are often the only stores that provide single-point access to data sourced from multiple systems – but these stores can be expensive and complicated to build and maintain, and they tend to lag behind business need.

Data Lakes offer the promise of fast, inexpensive data acquisition but, without specialized tools, lack the structure that makes their data accessible and useful to the business.

A managed Data Integration platform can provide both efficient data management across systems and a shared, single-point store of data sourced from all participating systems. A Data Integration platform solution is not technology or tool specific and can provide immediate and continuing value.

SQL Saturday is a free training event for Microsoft Data Platform professionals and those wanting to learn about SQL Server, Business Intelligence, and Analytics. This event will be held on March 25, 2017 at Colorado Technical University, 4435 North Chestnut Street, Colorado Springs, Colorado, 80907, United States.

Register for or Learn more about SQL Saturday

 

Simplify Data Management with the SDI Integration Hub™
http://sensibledataintegrations.com/2017/01/03/simplify-data-environments-with-the-sdi-integration-hub/
Tue, 03 Jan 2017 19:54:36 +0000

Every organization struggles to ensure data is up to date and consistent across multiple systems. Failing to solve this problem means managers spend more time resolving issues than running the company.

Diagnosing and resolving incorrect or inconsistent data is expensive and can result in the loss of current and potential business to more efficient competitors.

Repetitive manual processes used to move and manipulate data are labor intensive, costly, and prone to error. Expensive integration tools do not provide a complete solution either, as administration and configuration of these tools require specialized skills and often hard-to-find resources.

Organizations need a reliable, cost-effective, automated solution to ensure critical data is where it needs to be when it needs to be there, today and in the future as their business grows.

The SDI Integration Hub™ was developed for exactly this purpose.

The SDI Integration Hub™ platform makes easy and reliable data migration and integration possible. Some organizations use the Hub to migrate data from one system to another. For others, the Hub provides continuous data integration among multiple systems.

The Hub’s modular architecture separates data from process and removes system-specific dependencies. This design allows the Hub to adjust to changing needs without incurring costly redevelopment of direct system-to-system processes. The Hub can be maintained with basic technical skills, and is designed to be both affordable and capable.

Putting to use more than 70 years of combined data management and business experience, the SDI team tailors training and support to the needs of each organization. The team ensures that the Hub aligns with the unique requirements of each organization whether the need is for a one-time data migration or operation-critical integration.

The SDI Integration Hub™, delivered by SDI Certified Implementers, ensures Sensible solutions.

Learn more about how The SDI Integration Hub™ simplifies and adds value to your data environment.
