Tuesday, December 19, 2017

PASS Summit Video Recordings

Image by workandapix via pixabay

PASS (Professional Association for SQL Server) Summit is the largest conference for data professionals working with the Microsoft SQL Server and Business Intelligence stack. PASS typically holds the Summit once a year, in the fall, in Seattle, Washington. The Summit offers a wealth of knowledge: each year, there are hundreds of sessions on various SQL Server and Business Intelligence topics, presented by industry experts as well as Microsoft employees. If you are a data professional working with Microsoft SQL Server and Business Intelligence, PASS Summit is a great event for learning and networking with fellow data professionals.

If you were unable to attend past PASS Summits (or if you attended and want to relive and review those sessions), you can go to the SQL PASS site and watch past PASS Summit session recordings for free. As of the writing of this blog post, you can view recordings from PASS Summit 2015 and older for free (you need to be a member of SQL PASS, but registration is free). Even though they might be a few years old, they are excellent learning resources.

By the way, if you can attend PASS Summit in person, I would recommend it. PASS Summit 2018 is going to be in Seattle, Washington, in November 2018. You can get more information from PASS Summit 2018 site.

Tuesday, December 12, 2017

T-SQL Tuesday #97 - 2018 Learning Goals

This month's T-SQL Tuesday is hosted by Malathi Mahadevan (blog | twitter), and the topic is 2018 learning goals. Given that we are at the end of 2017, this is a timely topic. As data professionals and technology enthusiasts, we are living in an interesting time. There are so many opportunities to learn that we might feel overwhelmed by them. Goals can certainly help focus our learning efforts.

Learning Goals

In 2018, I am planning to learn more about architecting and managing cloud solutions. As I gain experience working with databases in the cloud, I feel that it would also be beneficial to learn about the different cloud components/services and how they can complement each other.

Ways to Learn

To help me focus, I am planning to get an Amazon AWS and/or Microsoft Azure certification. Amazon and Microsoft each publish an exam guide (or blueprint) that lists the skills you need in order to pass the certification exam. That guide can serve as an outline of topics to study. To learn the different topics, I am planning to read published whitepapers, FAQ articles, and blogs. Given the evolving nature of cloud technologies, blogs can help keep us updated with the latest information and features.

I am also planning to learn by taking online courses. I enjoy online courses; they allow me to study on my own time and at my own pace. There are some Amazon AWS and Microsoft Azure online courses available on Udemy and edX that I am planning to take.

Finally, I am also a believer in learning by doing. When learning a new concept or technology, I like to experiment and get hands-on experience. The good thing is that both Amazon and Microsoft offer opportunities to get hands-on experience with most of their cloud offerings for free for 12 months. For Amazon, we can sign up for the AWS Free Tier. For Microsoft, we can sign up for an Azure free account.

Friday, December 8, 2017

Backup Analysis Services Database Using PowerShell


We can use PowerShell to back up an Analysis Services (AS) multidimensional or tabular database. This is great because it gives us more flexibility in automating the AS database backup process. Typically, we have multiple Analysis Services databases within a single Analysis Services instance, and as the Database Administrator responsible for that instance, we want to back up all of them.

The PowerShell script below can be used to back up all of the databases within an Analysis Services instance. To accomplish this, we use Analysis Management Objects (AMO) to get the list of databases available within the instance. Then we use the Backup-ASDatabase cmdlet to back up each database to an Analysis Services backup (.abf) file. Alternatively, we could use the AMO Backup method to back up each database, but for this script I chose the Backup-ASDatabase cmdlet.

# Load the Analysis Management Objects (AMO) assembly.
# Backup-ASDatabase is available once the sqlps (or SqlServer) module is loaded.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$ServerName = "ServerName"
$BackupPath = "C:\Temp"

# Timestamp (24-hour clock) to make each backup file name unique.
$Dt = Get-Date -Format yyyyMMddHHmmss

# Make sure that the backup folder exists.
$BackupLocation = "$BackupPath\$ServerName"
If (!(Test-Path $BackupLocation))
{
    New-Item -ItemType Directory -Force -Path $BackupLocation | Out-Null
}

# Connect to the Analysis Services instance via AMO.
$Server = New-Object Microsoft.AnalysisServices.Server
$Server.Connect($ServerName)

# Loop through each database within the SSAS instance.
foreach ($Database in $Server.Databases)
{
    $DatabaseName = $Database.Name
    $BackupFile = $BackupLocation + "\" + $DatabaseName + "_" + $Dt + ".abf"

    Write-Host "Starting to backup $DatabaseName ..." -ForegroundColor Green
    Backup-ASDatabase -Server $ServerName -BackupFile $BackupFile -Name $DatabaseName -ApplyCompression -ErrorAction Stop
    Write-Host "$DatabaseName has been backed up successfully." -ForegroundColor Green
}

$Server.Disconnect()
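As mentioned above, the AMO Backup method on the Database object is an alternative to the Backup-ASDatabase cmdlet. A minimal sketch of that approach is below; the server name and backup path are placeholders, and this assumes the AMO assembly is available on the machine.

```powershell
# Load the AMO assembly and connect, as in the main script.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$Server = New-Object Microsoft.AnalysisServices.Server
$Server.Connect("ServerName")    # placeholder instance name

foreach ($Database in $Server.Databases)
{
    # Database.Backup() writes the .abf file directly, with no cmdlet needed.
    $Database.Backup("C:\Temp\$($Database.Name).abf")
}

$Server.Disconnect()
```

One trade-off of this approach is that options such as compression and overwrite behavior are set through a BackupInfo object rather than through cmdlet parameters.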

Reference

  • Backup-ASDatabase cmdlet - The page explains how to use the Backup-ASDatabase cmdlet and documents the different parameters that you can use. You can also get this information by running "Get-Help Backup-ASDatabase -Detailed" in a PowerShell window.