tag:blogger.com,1999:blog-87926286374954465452024-03-13T06:04:01.495-04:00Lucas NotesNotes by Lucas KartawidjajaLucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.comBlogger67125tag:blogger.com,1999:blog-8792628637495446545.post-30719199193946896352024-02-13T15:18:00.005-05:002024-02-13T15:24:02.058-05:00T-SQL Tuesday #171 - Describe the Most Recent Issue You Closed<div class="separator" style="text-align: center; clear: both;"><a style="clear: left; margin-right: 1em; margin-bottom: 1em; float: left;" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s1600/sqltuesday.png" imageanchor="1"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s200/sqltuesday.png" width="99" height="100" data-original-width="129" data-original-height="130" /></a></div> <p>This month T-SQL Tuesday is hosted by Brent Ozar (<a href="https://www.brentozar.com/blog/" target="_blank">blog</a>). For this month's T-SQL Tuesday, we are asked to describe the most recent ticket or issue that we closed.</p><a name='more'></a>
<p>In the not-so-distant past, one of the SQL Server issues that I closed was related to the Security Updates for the Microsoft SQL Server ODBC and OLE DB drivers that were published in October 2023. Here's the information on the vulnerabilities:</p>
<ul>
<li><a target="_blank" href="https://www.tenable.com/plugins/nessus/183036">Security Updates for Microsoft SQL Server ODBC Driver</a></li>
<li><a target="_blank" href="https://www.tenable.com/plugins/nessus/182968">Security Updates for Microsoft SQL Server ODBC Driver</a></li>
</ul>
<p>The vulnerabilities affect Microsoft ODBC Driver 17 and 18, as well as OLE DB Driver 18 and 19. More information, along with the download location for the security update/hotfix, can be found on the following page: <a target="_blank" href="https://techcommunity.microsoft.com/t5/sql-server-blog/update-hotfixes-released-for-odbc-and-ole-db-drivers-for-sql/ba-p/3848484">Update: Hotfixes released for ODBC and OLE DB drivers for SQL Server</a></p>
<p>We use an automated security scanning tool that flags the systems (servers, desktops, laptops, etc.) that haven't been patched, so we can quickly identify the systems that need to be patched and patch them promptly.</p>
<p>For this post, I was wondering if there is a quick way to identify the Microsoft ODBC and OLE DB drivers that are installed on our systems. This is for a few reasons:</p>
<ul>
<li>To verify that the automated security scanning tool is flagging the right systems.</li>
<li>To verify quickly that the patch has been installed successfully on those systems.</li>
<li>To capture the currently installed version of the Microsoft ODBC and OLE DB drivers, just in case we need to roll back.</li>
</ul>
<p>One way that I could think of is to use PowerShell with the Get-WmiObject cmdlet, if you are a Microsoft Windows shop. Here's the PowerShell script that we can use to see whether the Microsoft OLE DB driver and the Microsoft ODBC Driver 17 or 18 are installed on the system; it should also give the version number:</p>
<pre class="language-powershell;"><code>Get-WmiObject -Query "select * from win32_product where name = 'Microsoft OLE DB Driver for SQL Server' or name = 'Microsoft ODBC Driver 17 for SQL Server' or name ='Microsoft ODBC Driver 18 for SQL Server'" | Select-Object Name, Version
</code></pre>
<p>We can even pass computer names to the PowerShell command to get the information from remote systems:</p>
<pre class="language-powershell;"><code>Get-WmiObject -Query "select * from win32_product where name = 'Microsoft OLE DB Driver for SQL Server' or name = 'Microsoft ODBC Driver 17 for SQL Server' or name ='Microsoft ODBC Driver 18 for SQL Server'" -ComputerName ComputerA,ComputerB,ComputerC | Select-Object PSComputerName, Name, Version
</code></pre>
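<p>Another option is to check the version information stamped on the driver DLL files themselves, which is the approach the Microsoft article (linked below) recommends. Here is a minimal PowerShell sketch of that idea; note that the DLL names and the System32 location are assumptions based on a default 64-bit installation and may differ on your systems:</p><pre class="language-powershell"><code># Check the ProductVersion recorded in each driver DLL.
# NOTE: these file names/paths are assumptions for a default install; adjust as needed.
$driverDlls = @(
    "$env:SystemRoot\System32\msodbcsql17.dll",  # ODBC Driver 17
    "$env:SystemRoot\System32\msodbcsql18.dll",  # ODBC Driver 18
    "$env:SystemRoot\System32\msoledbsql.dll"    # OLE DB Driver 18/19
)
foreach ($dll in $driverDlls) {
    if (Test-Path $dll) {
        # VersionInfo exposes the same Product Version shown in the file properties dialog.
        (Get-Item $dll).VersionInfo | Select-Object FileName, ProductVersion
    }
}
</code></pre>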
<p>I am sure there are some other methods of identifying the version of the Microsoft ODBC and OLE DB drivers that are installed on your system, such as looking at the Product Version field in the file properties of the DLL files, as recommended in the <a target="_blank" href="https://techcommunity.microsoft.com/t5/sql-server-blog/update-hotfixes-released-for-odbc-and-ole-db-drivers-for-sql/ba-p/3848484">Microsoft</a> article (the sketch above is one way to automate that check). Please feel free to share in the comments if you have any other ways of identifying the Microsoft ODBC and OLE DB driver versions.</p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-13671581689326488362019-03-04T21:31:00.016-05:002021-06-08T22:38:53.857-04:00Trigger Windows Scheduled Task to Run Upon the Successful Completion of Another Scheduled Task<div class="separator" style="clear: both; text-align: center;"><a href="https://pixabay.com/illustrations/domino-game-falling-communication-163523/" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img align="left" alt="Trigger Tasks" border="0" height="153" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtOvW1hkALD4XPY3TR5PkemxdFqhr2yseNmqEirY7wR3Hc-yyUUFIy-7Bt5yJwK8XVkQOIQoe5MA3NNX64g-7YEtXXuVaFokHLBq72nCPoY9AHjY1ttXUdRHU0uEfZNmaJsrlrkqCdw51t/w200-h153/domino-163523_1280.jpg" style="background-image: none; display: inline; float: left;" title="Trigger Tasks" width="200" /></a></div><p>In one of the projects that I worked on, I had two tasks/jobs in Windows Task Scheduler; let's say the name of the first job is “FirstJob” and the name of the second job is “SecondJob”. I had a requirement where the “SecondJob” should run immediately after the “FirstJob” has completed successfully. There are several ways to achieve this. For this particular project, to fulfil the requirement, I would trigger the “SecondJob” based on “On an event”, which is the successful completion of the “FirstJob”.</p><a name='more'></a><p>The steps that I’ve taken to create the trigger on the “SecondJob”:</p><ul><li>In the “SecondJob” properties, within the Task Scheduler, go to the “Triggers” tab and click on the “New” button.</li><li>On the “New Trigger” window, for “Begin the task”, choose “On an event.” Then, in the “Settings” section, click on the “Custom” radio button and click on the “New Event Filter…” button. <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPSeTYPD1EqR4rAjFurnlnHwmKChEIovjYW60QTGT9v9uWJTl5ILsJ46h4RTocVaq7bHMyOPRVHZYlXby9hbiPPqZXr69C8LBqXOLvl1LAN8RMk9wuraxpVQeHSLydZKLhktMXCRY0iXII/s1600/NewTrigger.png"><img border="0" data-original-height="516" data-original-width="591" height="279" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPSeTYPD1EqR4rAjFurnlnHwmKChEIovjYW60QTGT9v9uWJTl5ILsJ46h4RTocVaq7bHMyOPRVHZYlXby9hbiPPqZXr69C8LBqXOLvl1LAN8RMk9wuraxpVQeHSLydZKLhktMXCRY0iXII/s320/NewTrigger.png" width="320" /></a></li><li>On the “New Event Filter” window, click on the “XML” tab, and check the “Edit query manually” checkbox. 
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEji41-Xmg-keK5RY2NUyGJ8Zk4uXrEK89gfrsf1ybQn3rzJ2FtdcmBP5MqqEPi8Qr7ruUonqDtaNinqraFKYe25q926bZqxMtj2Yhnr5G3pH_Cu-5lmIOr14ej1FduHtOalI8KoKIbjwVoq/s1600/NewEventFilter.png"><img border="0" data-original-height="551" data-original-width="543" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEji41-Xmg-keK5RY2NUyGJ8Zk4uXrEK89gfrsf1ybQn3rzJ2FtdcmBP5MqqEPi8Qr7ruUonqDtaNinqraFKYe25q926bZqxMtj2Yhnr5G3pH_Cu-5lmIOr14ej1FduHtOalI8KoKIbjwVoq/s320/NewEventFilter.png" width="315" /></a></li><li>You might get the following message: “If you choose to manually edit the query, you will no longer be able to modify the query using the controls on the Filter tab. Would you like to continue?” Click the “Yes” button to continue.</li><li>At this point you just need to enter the XPath of the event filter. This will be used to query the Windows Event. In my case, this is where I want to specify the trigger to start “SecondJob” when the “FirstJob” has been completed successfully. The XPath event filter that I was using: <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0ZdwzfV9jcCcPD5_1Y6mRNDi6yd8aXNoyIzziyYlMv6IGG6Mi9WpH_2D_vv3lgu8-PIyf1D-0YlIZho0Ws_-FXTl2l-Jr4GVojXqbuZ3ZfIJEIp9HY3ESz_Ybm5keBfQQMeXqkMgsed0k/s1600/EventFilterCode.png"><img border="0" data-original-height="551" data-original-width="543" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0ZdwzfV9jcCcPD5_1Y6mRNDi6yd8aXNoyIzziyYlMv6IGG6Mi9WpH_2D_vv3lgu8-PIyf1D-0YlIZho0Ws_-FXTl2l-Jr4GVojXqbuZ3ZfIJEIp9HY3ESz_Ybm5keBfQQMeXqkMgsed0k/s320/EventFilterCode.png" width="315" /></a></li></ul><p>The XPath event filter code:</p><pre class="language-xml"><code><QueryList>
<Query Id="0" Path="Microsoft-Windows-TaskScheduler/Operational">
<Select Path="Microsoft-Windows-TaskScheduler/Operational">*[EventData[@Name='ActionSuccess'][Data [@Name='TaskName']='\FirstJob']] and *[EventData[@Name='ActionSuccess'][Data [@Name='ResultCode']='0']]</Select>
</Query>
</QueryList>
</code></pre><p>You can adjust the above code to your situation; the most likely change would be the task name.</p><p><strong>Notes: </strong>In Windows 10, the task history for Task Scheduler is disabled by default. In order to make this work, you will need to enable it. To do this, you can go to the Task Scheduler and, without selecting any task, click on “Action” in the top menu and then “Enable All Tasks History”. </p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-5757258353321819982019-01-15T14:01:00.003-05:002021-06-04T22:17:47.822-04:00Books in Review - 2018<div class="separator" style="clear: both; text-align: center;"><a href="https://commons.wikimedia.org/wiki/File:Great_Books.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="Books in Review" border="0" height="133" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtEOuZ_mcU42ONZjdZv4LmkXJNGhUY-PLWzyomyVBnIijcnkvDnMd-y_X8UiTvTW50v0jNb4fAhPch0DTLlV0SbGXwI-pPbujK_VbffICZZYZovKr2eI1iIFYNstfODk0Cylj47_wQmGw1/w200-h133/320px-Great_Books.jpg" style="background-image: none; display: inline;" title="Books in Review" width="200" /></a></div><p>Back in 2017, one of my personal goals for the year was to read more books. In 2017, I managed to read 25 books; the complete list can be found in <a href="http://www.lucasnotes.com/2018/01/books-in-review-2017.html">Books in Review - 2017</a>. I tried to keep it up in 2018, and I was able to read 25 books in 2018 as well. There were a lot of good books to read.</p><a name='more'></a>
<h3>The Complete List</h3>
<p>Below is the complete list of books that I've read in 2018:</p>
<ol>
<li><a href="https://www.amazon.com/Ill-Give-You-Jandy-Nelson-ebook/dp/B007HUD5N0/ref=sr_1_2?ie=UTF8&qid=1547172011&sr=8-2&keywords=I%27ll+Give+You+the+Sun" target="_blank">I'll Give You the Sun</a>. Jandy Nelson. Dial Books. Fiction. 386 pages.</li>
<li><a href="https://www.amazon.com/How-Not-Be-Wrong-Mathematical-ebook/dp/B00G3L6JQ4/ref=sr_1_1?ie=UTF8&qid=1547172214&sr=8-1&keywords=How+Not+to+Be+Wrong%3A+The+Power+of+Mathematical+Thinking" target="_blank">How Not to Be Wrong: The Power of Mathematical Thinking</a>. Jordan Ellenberg. Penguin Books. Mathematics. 466 pages.</li>
<li><a href="https://www.amazon.com/Men-Without-Women-Haruki-Murakami-ebook/dp/B01KE64XTS/ref=sr_1_1?ie=UTF8&qid=1547172346&sr=8-1&keywords=Men+Without+Women%3A+Stories" target="_blank">Men Without Women: Stories</a>. Haruki Murakami. Vintage. Fiction. 242 pages.</li>
<li><a href="https://www.amazon.com/Fierce-Conversations-Achieving-Success-Conversation-ebook/dp/B000P28V2M/ref=sr_1_1?ie=UTF8&qid=1547172511&sr=8-1&keywords=Fierce+Conversations" target="_blank">Fierce Conversations: Achieving Success at Work and in Life One Conversation at a Time</a>. Susan Scott. Berkley. Communication. 365 pages.</li>
<li><a href="https://www.amazon.com/Make-Your-Bed-Little-Things-ebook/dp/B01KFJGT50/ref=sr_1_2?ie=UTF8&qid=1547172629&sr=8-2&keywords=Make+Your+Bed" target="_blank">Make Your Bed: Little Things That Can Change Your Life...And Maybe the World</a>. William H. McRaven. Grand Central Publishing. Self-Help. 125 pages.</li>
<li><a href="https://www.amazon.com/Pocket-Prayers-Simple-Bring-Peace-ebook/dp/B00KQ2G394/ref=sr_1_1?ie=UTF8&qid=1547172742&sr=8-1&keywords=Pocket+Prayers%3A+40+Simple+Prayers+that+Bring+Peace+and+Rest" target="_blank">Pocket Prayers: 40 Simple Prayers that Bring Peace and Rest</a>. Max Lucado. Thomas Nelson. Religion. 63 pages.</li>
<li><a href="https://www.amazon.com/Paper-Boats-Dee-Lestari-ebook/dp/B01LZOZXKX/ref=sr_1_1?s=books&ie=UTF8&qid=1547172917&sr=1-1&keywords=Paper+Boats" target="_blank">Paper Boats</a>. Dee Lestari. AmazonCrossing. Fiction. 400 pages.</li>
<li><a href="https://www.amazon.com/Horrible-Histories-Special-Twentieth-Century-ebook/dp/B00EIJI5MO/ref=sr_1_1?s=books&ie=UTF8&qid=1547173046&sr=1-1&keywords=Horrible+Histories%3A+20th+Century" target="_blank">Horrible Histories Special: Twentieth Century</a>. Terry Deary, Philip Reeve. Scholastic Non-Fiction. History. 176 pages.</li>
<li><a href="https://www.amazon.com/1st-Die-Womens-Murder-Club-ebook/dp/B000FA5Q2M/ref=sr_1_1?s=books&ie=UTF8&qid=1547173186&sr=1-1&keywords=1st+to+Die+%28Women%27s+Murder+Club%29" target="_blank">1st to Die (Women's Murder Club)</a>. James Patterson. Little, Brown and Company. Fiction. 488 pages.</li>
<li><a href="https://www.amazon.com/Everything-Store-Jeff-Bezos-Amazon-ebook/dp/B00BWQW73E/ref=sr_1_1?s=books&ie=UTF8&qid=1547173317&sr=1-1&keywords=The+Everything+Store" target="_blank">The Everything Store: Jeff Bezos and the Age of Amazon </a>. Brad Stone. Little, Brown and Company. Business & Money. 386 pages.</li>
<li><a href="https://www.amazon.com/Hagakure-Secret-Samurai-Yamamoto-Tsunetomo-ebook/dp/B00K9W25KW/ref=sr_1_1?s=books&ie=UTF8&qid=1547173564&sr=1-1&keywords=Hagakure" target="_blank">Hagakure: The Secret Wisdom of the Samurai</a>. Yamamoto Tsunetomo. Tuttle Publishing. History. 288 pages.</li>
<li><a href="https://www.amazon.com/Creativity-Inc-Overcoming-Unseen-Inspiration-ebook/dp/B00FUZQYBO/ref=sr_1_3?s=books&ie=UTF8&qid=1547173682&sr=1-3&keywords=Creativity%2C+Inc" target="_blank">Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration</a>. Ed Catmull, Amy Wallace. Random House. Business & Money. 368 pages.</li>
<li><a href="https://www.amazon.com/People-Problems-Crazy-Asians-Trilogy-ebook/dp/B01M09122V/ref=sr_1_1?s=books&ie=UTF8&qid=1547173836&sr=1-1&keywords=Rich+People+Problems" target="_blank">Rich People Problems</a>. Kevin Kwan. Anchor. Fiction. 418 pages.</li>
<li><a href="https://www.amazon.com/Everybody-Lies-Internet-About-Really-ebook/dp/B01AFXZ2F4/ref=sr_1_1?s=books&ie=UTF8&qid=1547173945&sr=1-1&keywords=Everybody+Lies%3A" target="_blank">Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are</a>. Seth Stephens-Davidowitz. Dey Street Books. Big Data. 357 pages.</li>
<li><a href="https://www.amazon.com/Magnolia-Story-Bonus-Content-ebook/dp/B01CH2EW20/ref=sr_1_1?s=books&ie=UTF8&qid=1547174091&sr=1-1&keywords=The+Magnolia+Story" target="_blank">The Magnolia Story</a>. Chip Gaines, Joanna Gaines, Mark Dagostino. Thomas Nelson. Biographies. 189 pages.</li>
<li><a href="https://www.amazon.com/Crazy-Rich-Asians-Trilogy-Book-ebook/dp/B00AP2VQEM/ref=sr_1_1?s=books&ie=UTF8&qid=1547174212&sr=1-1&keywords=Crazy+Rich+Asians" target="_blank">Crazy Rich Asians</a>. Kevin Kwan. Anchor. Fiction. 546 pages</li>
<li><a href="https://www.amazon.com/Ready-Player-One-Ernest-Cline-ebook/dp/B004J4WKUQ/ref=sr_1_1?s=books&ie=UTF8&qid=1547174323&sr=1-1&keywords=Ready+Player+One" target="_blank">Ready Player One</a>. Ernest Cline. Broadway Books. Fiction. 386 pages.</li>
<li><a href="https://www.amazon.com/China-Rich-Girlfriend-Asians-Trilogy-ebook/dp/B00NRQORA8/ref=sr_1_1?s=books&ie=UTF8&qid=1547174440&sr=1-1&keywords=China+Rich+Girlfriend" target="_blank">China Rich Girlfriend</a>. Kevin Kwan. Anchor. Fiction. 394 pages.</li>
<li><a href="https://www.amazon.com/Daemon-Daniel-Suarez-ebook/dp/B003QP4NPE/ref=sr_1_2?s=books&ie=UTF8&qid=1547174513&sr=1-2&keywords=Daemon" target="_blank">Daemon</a>. Daniel Suarez. Dutton. Fiction. 489 pages.</li>
<li><a href="https://www.amazon.com/Fantastic-Beasts-Where-Find-Them-ebook/dp/B01ETJABQK/ref=sr_1_4?s=books&ie=UTF8&qid=1547174601&sr=1-4&keywords=Fantastic+Beasts+and+Where+to+Find+Them" target="_blank">Fantastic Beasts and Where to Find Them: The Original Screenplay</a>. J. K. Rowling. Pottermore Publishing. Fiction. 304 pages.</li>
<li><a href="https://www.amazon.com/No-Middle-Name-Complete-Collected-ebook/dp/B01MTJQFQY/ref=sr_1_3?s=books&ie=UTF8&qid=1547174927&sr=1-3&keywords=No+Middle+Name%3A+The+Complete+Collected+Jack+Reacher+Short+Stories" target="_blank">No Middle Name: The Complete Collected Jack Reacher Short Stories</a>. Lee Child. Dell. Fiction. 476 pages.</li>
<li><a href="https://www.amazon.com/Harry-Potter-Chamber-Secrets-Rowling-ebook/dp/B0192CTMW8/ref=sr_1_1?s=books&ie=UTF8&qid=1547175026&sr=1-1&keywords=Harry+Potter+and+the+Chamber+of+Secrets" target="_blank">Harry Potter and the Chamber of Secrets</a>. J. K. Rowling. Pottermore Publishing. Fiction. 357 pages.</li>
<li><a href="https://www.amazon.com/Harry-Potter-Prisoner-Azkaban-Rowling-ebook/dp/B0192CTMX2/ref=sr_1_3?s=books&ie=UTF8&qid=1547175112&sr=1-3&keywords=Harry+Potter+and+the+Prisoner+of+Azkaban" target="_blank">Harry Potter and the Prisoner of Azkaban</a>. J. K. Rowling. Pottermore Publishing. Fiction. 475 pages.</li>
<li><a href="https://www.amazon.com/Art-Asking-Learned-Worrying-People-ebook/dp/B00IRISKD6/ref=sr_1_1?s=books&ie=UTF8&qid=1547175184&sr=1-1&keywords=The+Art+of+Asking%3A+How+I+Learned+to+Stop+Worrying+and+Let+People+Help" target="_blank">The Art of Asking: How I Learned to Stop Worrying and Let People Help</a>. Amanda Palmer. Grand Central Publishing. Biographies. 342 pages.</li>
<li><a href="https://www.amazon.com/Pivot-Only-Move-That-Matters-ebook/dp/B00ZQH2UCS/ref=sr_1_1?s=books&ie=UTF8&qid=1547175277&sr=1-1&keywords=Pivot%3A+The+Only+Move+That+Matters+Is+Your+Next+One" target="_blank">Pivot: The Only Move That Matters Is Your Next One</a>. Jenny Blake. Portfolio. Business & Money. 283 pages.</li>
</ol>
<h3>Fave Five</h3>
<p>There were a lot of good books that I read in 2018; below are my top 5:</p>
<ul>
<li><a href="https://www.amazon.com/dp/ASIN/B01KFJGT50" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B01KFJGT50&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>Make Your Bed: Little Things That Can Change Your Life...And Maybe the World. William H. McRaven. Grand Central Publishing. Self-Help. 125 pages.</b> - This book is based on University of Texas Austin 2014 commencement speech by Admiral William H. McRaven. You can watch the speech <a href="https://www.youtube.com/watch?v=yaQZFhrW0fU" target="_blank">here</a>. Admiral William H. McRaven is a former Navy SEAL Commander. In in his book (and in his commencement speech), he gave 10 life lessons from his experience during Navy SEAL basic training. We can use these 10 lessons to change our life and possibly the world.</li>
<li><a href="https://www.amazon.com/dp/ASIN/B00FUZQYBO" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B00FUZQYBO&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration. Ed Catmull, Amy Wallace. Random House. Business & Money. 368 pages.</b> - Ed Catmull is one of the founders of Pixar. I love watching most of Pixar's movies and shorts. It's amazing that Pixar has been able to constantly deliver successful movies, such as Toy Story, Finding Nemo, Cars and The Incredibles. The book gives a lot of insights on Ed Catmull's thoughts and wisdom in managing creative company, Pixar cultures and some events that occurred, such as Steve Jobs interaction with Ed Catmull and Pixar and also changes that happened after Disney acquired Pixar. It was interesting to read how Ed Catmull try to preserve Pixar's culture and change Disney animation's cultures. The book contains a wealth of information for leaders, not only for leaders in creative industry, but also for leaders in other industries as well. </li>
<li><a href="https://www.amazon.com/dp/ASIN/B01AFXZ2F4" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B01AFXZ2F4&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are. Seth Stephens-Davidowitz. Dey Street Books. Big Data. 357 pages.</b> - If you like to read <a href="https://www.amazon.com/Freakonomics-Economist-Explores-Hidden-Everything/dp/0060731338" target="_blank">Freakonomics</a> and <a href="https://www.amazon.com/SuperFreakonomics-Cooling-Patriotic-Prostitutes-Insurance/dp/0060889586/ref=pd_lpo_sbs_14_t_1?_encoding=UTF8&psc=1&refRID=RG25ZV68R3FQZTNAS284" target="_blank">SuperFreakonomics</a> by Steven D. Levitt and Stephen J. Dubner, you will most likely enjoy reading this book. Similar to Freakonomics and SuperFreakonomics, this book tries to make sense (meaningful information) from data collected. It is interesting to learn that people lies on surveys (even anonymous surveys) and, not surprising, people turn to Google/ Internet search for answers. It is fascinating to know that you can actually learn a lot of things from people's Google search terms and suggestions. </li>
<li><a href="https://www.amazon.com/dp/ASIN/B01CH2EW20" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B01CH2EW20&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>The Magnolia Story. Chip Gaines, Joanna Gaines, Mark Dagostino. Thomas Nelson. Biographies. 189 pages.</b> - Chip and Joanna Gaines are husband and wife who starred on HGTV's Fixer Upper show. I love watching the Fixer Upper. The show was about finding houses with potential. Chip and Joanna Gaines would remodelled those houses into dream homes for their clients. The book would tells their story. How Chip and Joanna met each other. How they got that unexpected call from HGTV to do the Fixer Upper show. How they are different from each other but complete each other. The book would also document their struggles and success. I enjoyed reading the book. It was entertaining and also inspiring.</li>
<li><a href="https://www.amazon.com/dp/ASIN/B00AP2VQEM" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B00AP2VQEM&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>Crazy Rich Asians. Kevin Kwan. Anchor. Fiction. 546 pages</b> - The main characters in the novels are Nick Young and Rachel Chu. Both of them are NYU professors and they are in love with each other. Nick is from Singapore, while Rachel is from California. Nick decided to take Rachel to Singapore to meet his family. Unbeknownst to Rachel, Nick came from ultra rich family. The novel has almost everything, from romance, drama to humor. The novel was fun to read. The novel is part of a trilogy. After this novel, we have <a href="https://www.amazon.com/dp/ASIN/B00NRQORA8" target="_blank">China Rich Girlfriend</a> and <a href="http://www.amazon.com/dp/ASIN/B01M09122V" target="_blank">Rich People Problems</a>.</li>
</ul>
Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-19504542600899062672019-01-04T12:20:00.006-05:002021-06-08T22:14:33.678-04:00Executing xp_cmdshell with Non SysAdmin Account<div class="separator" style="clear: both; text-align: center;"><a href="http://thebluediamondgallery.com/tablet-dictionary/a/alternative.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="Image by The Blue Diamond Gallery" border="0" height="133" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7ZU26gGqSDx_RApGptYQA5ET5ZUU0Nugi5Ym_rKtVNECa99hd5j7v3ck7gCZvpTsYiIeNfbyL3_80jqIi3rph5_bru3N5YRmJsvTrbKGc60IRzM2kSothYFYYkQSqgq4ym2Cvv6igAuhP/w200-h133/alternative.jpg" style="background-image: none; display: inline;" title="Image by The Blue Diamond Gallery" width="200" /></a></div><p>In SQL Server, xp_cmdshell is an extended stored procedure, stored in the master database, that allows us to execute Windows command shell commands. By default, the ability to execute xp_cmdshell is disabled. We can enable it by executing the following query against the SQL Server instance:</p><a name='more'></a><br /><br /><pre class="language-sql"><code>USE master;
GO
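-- Allow advanced options to be changed so that xp_cmdshell can be configured.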
EXEC sp_configure 'show advanced options', 1;
reconfigure;
GO
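-- Enable the xp_cmdshell extended stored procedure.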
EXEC sp_configure 'xp_cmdshell', 1;
reconfigure;
GO
</code></pre><p>Once we run the above T-SQL query, any account that is part of the sysadmin role in the SQL Server instance has the ability to run the xp_cmdshell extended stored procedure. In the background, when a user with sysadmin privileges runs xp_cmdshell, it executes the Windows command shell using the SQL Server service account (so if you are executing xp_cmdshell to access a certain resource on the network, for example, and you are having a permission issue, you might want to make sure that the SQL Server service account has permission to that resource).</p><p>Now, what if you have a non-sysadmin account that needs to run xp_cmdshell? In order to do that, we would need to do some additional configuration. For this example, let's say that we want to grant the SQL Server login 'johndoe' the ability to execute the xp_cmdshell extended stored procedure (<b>Note:</b> it doesn’t have to be a SQL Server login; it can also be a Windows authentication login). And we want to execute the command passed to xp_cmdshell using the Windows account 'Domain\RunUser'.</p><p>First, we would need to create a cmdshell proxy account on the SQL Server instance. This essentially tells SQL Server to use the proxy account when executing the Windows command shell on behalf of non-sysadmin users. To do that, we can run the following T-SQL statement:</p><pre class="language-sql"><code>USE master;
GO
IF NOT EXISTS (SELECT 1 FROM sys.credentials WHERE [name] = '##xp_cmdshell_proxy_account##')
BEGIN
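-- This system credential tells SQL Server which Windows account to use when a non-sysadmin runs xp_cmdshell.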
CREATE CREDENTIAL ##xp_cmdshell_proxy_account## WITH IDENTITY = 'Domain\RunUser', SECRET = 'PasswordForTheRunUser'
END
GO
</code></pre><p>Next, we would need to grant the non-sysadmin account (in this case, the SQL Server login 'johndoe') access to the master database and the execute permission on the xp_cmdshell extended stored procedure. To do this, we need to execute the following T-SQL statement:</p>
<pre class="language-sql"><code>USE [master];
GO
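-- Add the login as a user in the master database (sp_grantdbaccess is deprecated; CREATE USER is the modern equivalent).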
EXEC sp_grantdbaccess 'johndoe';
GO
GRANT EXEC ON xp_cmdshell TO johndoe;
GO
</code></pre><p>Now, if the proxy account that we use (in this case, the domain account 'Domain\RunUser') doesn’t have local administrator rights on the Windows Server where the SQL Server instance is installed, we would need to grant it the "Log on as a batch job" privilege in the "Local Security Policy". To do that, we can go to "Local Security Policy", then "User Rights Assignment", and look for "Log on as a batch job". Add 'Domain\RunUser' to the "Log on as a batch job" policy. If we don’t do this, we will most likely get the following error when the 'johndoe' SQL Server login tries to execute xp_cmdshell:</p><blockquote>Msg 15121, Level 16, State 200, Procedure xp_cmdshell, Line 1<br />An error occurred during the execution of xp_cmdshell. A call to 'LogonUserW' failed with error code: '1385'.</blockquote><p>As a quick test, you can run the following T-SQL statement:</p><pre class="language-sql"><code>
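-- Impersonate the non-sysadmin login; the command below should run under the proxy account.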
EXEC AS LOGIN = 'johndoe';
EXEC xp_cmdshell 'whoami';
REVERT;
</code>
</pre><p>The above T-SQL statement runs xp_cmdshell 'whoami' while impersonating the 'johndoe' SQL Server login. If things are set up properly, you should see 'Domain\RunUser' as the result.</p><h3>To Clean Up or Revert Back:</h3><p>First, we would remove 'Domain\RunUser' from the "Local Security Policy" - "User Rights Assignment" - "Log on as a batch job" policy.</p><p>Then, we would run the following T-SQL statement to revoke access for the 'johndoe' user:</p><pre class="language-sql"><code>USE [master];
GO
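-- Revoke the execute permission, then remove the user from the master database.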
REVOKE EXEC ON xp_cmdshell TO johndoe;
GO
EXEC sp_revokedbaccess 'johndoe';
GO
</code></pre><p>Lastly, we would run the following T-SQL statement to remove the proxy account:</p><pre class="language-sql"><code>USE [master];
GO
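-- Passing NULL deletes the ##xp_cmdshell_proxy_account## proxy credential.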
EXEC sp_xp_cmdshell_proxy_account NULL;
GO
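-- Optionally, disable xp_cmdshell again (assuming it was enabled only for this exercise):
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;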
</code></pre><h3>References</h3><ul><li>xp_cmdshell <a href="https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/xp-cmdshell-transact-sql?view=sql-server-2017" target="_blank">https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/xp-cmdshell-transact-sql?view=sql-server-2017</a></li><li>Troubleshooting xp_cmdshell failures <a href="https://blogs.msdn.microsoft.com/psssql/2008/04/10/troubleshooting-xp_cmdshell-failures/" target="_blank">https://blogs.msdn.microsoft.com/psssql/2008/04/10/troubleshooting-xp_cmdshell-failures/</a></li></ul>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-86523379628285641772018-02-21T17:03:00.002-05:002021-05-28T20:50:11.583-04:00Windows 10 – Mobile Hotspot<div class="separator" style="text-align: center; clear: both;"><a style="clear: left; margin-right: 1em; margin-bottom: 1em; float: left;" href="https://pixabay.com/en/windows-10-laptop-screen-wallpaper-1535765/"><img title="Image by barek2marcin via pixabay" style="display: inline; background-image: none;" border="0" alt="Image by barek2marcin via pixabay" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWoEoAodAbOboetMN03bry5fjLsk5UiY_Dte6zzDYRp0X0_S_km6iIqrafsuRF-DsRIk7pCJYNPxAPKK88we9CpJLFFbrDQrgWASTf9nnGxyaGAUD76VPvAWSwV7t8ztp_mC8z-qWgne4V/?imgmax=800" width="244" height="154" /></a></div> <p>A few weeks ago, I was doing a demo of a couple of Alexa skills that I've developed, using an <a href="http://amzn.to/2Fj6dkK" target="_blank">Amazon Echo Dot</a>. However, the place where I had to do the demo uses enterprise-class Wi-Fi (where, after connecting to the Wi-Fi network, we need to authenticate using a web browser). Unfortunately, at the moment, as far as I know, the Amazon Echo Dot doesn't support enterprise-class Wi-Fi. To make matters worse, my cell phone signal was very weak in the meeting room, so I was unable to use my cell phone as a hotspot.</p><a name='more'></a> <p>Luckily, I have my Windows 10 laptop. Windows 10 has a feature where I can turn my laptop into a mobile hotspot. To make it work, I need to make sure that my laptop has an Internet connection. Before the demo, I would connect my laptop to the Wi-Fi network (and, when prompted, authenticate myself using the username and password provided by the company). Once I am connected to the Wi-Fi network and have an Internet connection, I would then:</p> <ol> <li> On my Windows 10 laptop, go to “<strong>Settings</strong>”. </li> <li>Then I would select “<strong>Network & Internet</strong>”. </li> <li>On the left navigation area of the “<strong>Network & Internet</strong>” window, there should be an option for “<strong>Mobile hotspot</strong>”. I would click on that.</li> <li>On the “<strong>Mobile hotspot</strong>” screen, you should see a screen similar to the one below. By default, Windows pre-defines the mobile hotspot network (SSID) name and password for you. However, you can change them by using the “<strong>Edit</strong>” button. 
To turn on the mobile hotspot, click the on/off toggle button near the top of the screen; that will enable the mobile hotspot.</li> </ol> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7QKy9zMYQCy6Q0cr0WfAOgpXFUULMSdpMTyCg9vioUC6M5aZ0t02CXSPwr5O9j3mW8bDpXLGdoBmVpwIzYrLGGuZj1YsVxiXQaGGpUas0ENlBmcSoCt4KItwQ_SteCeviAaisgOpiqBaL/s1600-h/W10MobileHotspot%255B3%255D"><img title="W10MobileHotspot" style="margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" border="0" alt="W10MobileHotspot" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCbScWFzXgV5PvehCx5N_5dlej1qZfS59iCTCqB4sb-zvq4vC674OsdcmewzVZsC56uesgatC3Z5XHuPVuVutXAYbuRGc78m-1YNrsVntVrcWc0dn3UlxMP3zQ0Mimsqsjy9zj5nAU_IeZ/?imgmax=800" width="644" height="436" /></a></p> <p>At this point, you can go to your other device (in my case, the Alexa app on my mobile phone to configure the Amazon Echo Dot) and configure it to connect to the mobile hotspot that has just been set up on the Windows 10 system.</p> <p>In my case, the mobile hotspot feature in Windows 10 works great. I was able to demo the Alexa skills that I’ve developed on the Amazon Echo Dot without any issue in places that use enterprise-class Wi-Fi. </p> <h3>Some Observations</h3> <ul> <li>Windows 10 seems to be smart enough to detect whether the system has wireless (Wi-Fi) capability or not. If the system does not have a Wi-Fi card, I do not see the “<strong>Mobile hotspot</strong>” option under “<strong>Settings</strong>” – “<strong>Network & Internet</strong>”.</li> <li>When I am using a VPN connection, I might not be able to enable the mobile hotspot feature.</li> <li>After some period of inactivity on the system, Windows 10 automatically disables the mobile hotspot.</li> </ul>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-67232611369406571762018-02-13T01:23:00.001-05:002021-05-28T20:51:34.804-04:00T-SQL Tuesday #99 – Work-Life Balance<div class="separator" style="text-align: center; clear: both;"><a style="clear: left; margin-right: 1em; margin-bottom: 1em; float: left;" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s1600/sqltuesday.png" imageanchor="1"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s200/sqltuesday.png" width="99" height="100" data-original-width="129" data-original-height="130" /></a></div> <p>This month T-SQL Tuesday is hosted by Aaron Bertrand (<a href="https://sqlblog.org" target="_blank">blog</a> | <a href="https://twitter.com/AaronBertrand" target="_blank">twitter</a>). For this month's T-SQL Tuesday, we are given two options. The first option is to write about something that we are passionate about outside of SQL and the tech community. The second option is to write about T-SQL bad habits; Mr. Bertrand has <a href="https://blogs.sentryone.com/aaronbertrand/bad-habits-revival/" target="_blank">a long list of them</a>. 
For this entry, I chose to write about activities that I do outside of the SQL and tech community <a href="https://twitter.com/hashtag/sqlibrium?src=hash&lang=en" target="_blank">#sqlibrium</a>.</p><a name='more'></a> <h3>Cooking</h3> <p>One activity that helps me decompress from a busy week at work is cooking. By no means am I a great, or even a good, cook. I love to research recipes online and try them. I also try to experiment with the recipes: tweaking, substituting, or adding ingredients. Cooking allows me to channel my inner creativity. Sure, there were times when the “experiments” ended up as disasters (plenty of times, actually). Those times, I ended up ordering and eating take-out pizza or Chinese food. But hey, Edison failed 9,999 times before he was successful in inventing the light bulb.</p> <p>It also helps to cook something that you love eating; it will give you that extra motivation. For example, I love eating ramen noodles (not the instant kind, although I do get my instant ramen fix once in a while). Recently, I embarked on a ramen-noodle-making journey. After reading books and blogs, watching YouTube (I think you can learn about almost anything on YouTube nowadays), and attending a weekend class on "The Art of Ramen”, I spent several weeks experimenting with making ramen noodles. Early on, it was not pretty. But after several attempts, I was able to make miso ramen. Some pictures from one of the successful experiments:</p> <table cellspacing="0" cellpadding="0" width="100%" border="0"><tbody> <tr> <td valign="top"><img title="Fresh Ramen Noodles" style="border: 0px currentcolor; border-image: none; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" border="0" alt="Fresh Ramen Noodles" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhw4Iaz0IkMyHlu0fosEBD2CCXRH6kz-meUEooUTcCzvjb3eqcsk-squq1BfrBjOzpgZCJDpyA-fnUz6CCixG9QrF9Q-knxWBEmLpIgp6cOER0XHX3BZZuJwCQgGrOAMKNzyZb4hdQAZc7j/?imgmax=800" width="380" height="500" /></td> </tr> <tr> <td valign="top"> <p align="center"><em>Fresh Ramen Noodles</em></p> </td> </tr> </tbody></table> <table cellspacing="0" cellpadding="0" width="100%" border="0"><tbody> <tr> <td valign="top"><img title="Ramen with Pork and Egg" style="border: 0px currentcolor; border-image: none; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" border="0" alt="Ramen with Pork and Egg" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZq4mHYYjrZSrRiRhopXKiOlxmbuYda1zL1OsfnraE6Jh3OCICroHO6G2lT_5ulGoXEvuTJRnMMC-7NLA0iwmitCPsj5PHK_XFfHZUWpRWSTyEpI6odccFrbmQH3CPvO8xCH9yTDzPmq06/?imgmax=800" width="380" height="500" /></td> </tr> <tr> <td valign="top"> <p align="center"><em>Ramen with Pork and Egg</em></p> </td> </tr> </tbody></table> <table cellspacing="0" cellpadding="0" width="100%" border="0"><tbody> <tr> <td valign="top"><img title="Miso Ramen" style="border: 0px currentcolor; border-image: none; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" border="0" alt="Miso Ramen" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJU0cceoKz70iVY7zpdhZP2QZNG9LOzYPe3oCcH_dlRxhkIYCMAbn5iBY2W17MwMvo0Jq59lcEyNUZxkUD7yMCx1uZsxd3llhsAq6yJkaGQrvZYe8AgqufYCvnlMS-ZB78nYhdDWj9AlFH/?imgmax=800" width="660" height="500" /></td> </tr> <tr> <td valign="top"> <p align="center"><em>Miso Ramen</em></p> </td> </tr> </tbody></table> <p>It took many failed attempts and several weeks. 
But I must say, success tastes delicious (especially on cold winter days in New England).</p> <h3>Exercise</h3> <p>Health is very important. During the week, I try to run a few miles three times a week. I love running because when I am running, I am alone with my thoughts. There are times when I get great ideas during my runs. In addition to running, I am also trying to learn to play ice hockey. Mostly, I play in pick-up games. It is fun and a great workout.</p> <p>Last year, I got the opportunity to join a dragon boat team. We practiced and raced in the 38th annual <a href="http://www.bostondragonboat.org/" target="_blank">Boston dragon boat festival</a> and the 19th annual <a href="http://www.dragonboatri.com/site/" target="_blank">Rhode Island Chinese Dragon Boat Races and Taiwan Day Festival</a>. We had a great coach and also a great group of people for teammates. We encouraged and built one another up. Just like in any other team sport, teamwork is very important. It is not enough for each individual to have the correct paddling technique; the team needs to be in sync as well. The practices were hard (there were lots of aches and pains after the first couple of practices; I felt aches in muscles that I didn’t even know existed), but I really enjoyed them. Our practice sessions were on the Charles River, where we had a great view of the city. Most importantly, I enjoyed the camaraderie.</p> <table cellspacing="0" cellpadding="0" width="100%" border="0"><tbody> <tr> <td valign="top"><a href="https://www.facebook.com/asianamericancivicassociation/"><img title="AACA Dragon Boat Team" style="border: 0px currentcolor; border-image: none; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" border="0" alt="AACA Dragon Boat Team" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiF5pYJP-Tk9pIZHMo2QjI4JopieSPWms7A4yVrkc6YY0vhfrV5SjZuaRpvpEZypMBz08fDmuQIr-8j4Zd84gbBwzI-_OXPoCXQtpqG_UXZiGbxbqLBszjn06yHvd7kqIoyVzrVF-hkuusH/?imgmax=800" width="644" height="288" /></a></td> </tr> <tr> <td valign="top"> <p align="center"><em>Source: AACA</em></p> </td> </tr> </tbody></table> <h3>How About You?</h3> <p>I would love to hear from others in the SQL community about their activities outside of the SQL and tech community.</p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-73562661094587416402018-01-04T18:00:00.002-05:002021-06-09T21:49:18.293-04:00Books in Review - 2017<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_62RIO21IjEq2e-QfLfEnxSYIiFyZQ6mcEWEJrWCoXlgd3HQSLqfL-Dfr0Ihn-bGnMkaX9kDQAhgQCgwaMpyFP63TEbVeW0RvVy90-Qw90-05zth-0yGjNnp65nqMdxBIkI4LQRnBYWzz/s1600/books.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="Book List" border="0" data-original-height="133" data-original-width="200" height="133" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_62RIO21IjEq2e-QfLfEnxSYIiFyZQ6mcEWEJrWCoXlgd3HQSLqfL-Dfr0Ihn-bGnMkaX9kDQAhgQCgwaMpyFP63TEbVeW0RvVy90-Qw90-05zth-0yGjNnp65nqMdxBIkI4LQRnBYWzz/w200-h133/books.jpg" title="Book List" width="200" /></a></div><p>One of my personal goal for 2017 was to read more books. I was gunning to read 2 books a month. In 2017, I read 25 books (1 more book than the yearly target goal). I am not a fast reader. Normally, I read books when I commute to and from work. Most of the books were rented from local library. I used to prefer hard copy books than digital books. However, a couple of years ago, for non technical books, I made the transition to get digital (Kindle) books whenever possible. It felt awkward at first, but after reading several books on my kindle, I am starting to get used to it. The good thing is that my local library also has a good selection of books in digital format. Since 2017 has just ended, I think it would be good to inventory the books that I've read in 2017. It seems that I read various genres of books from fiction, self-help, autobiography, etc.</p><a name='more'></a>
<h3>The Complete List</h3>
<p>Below is the complete list of books that I've read in 2017:</p>
<ol>
<li><a href="https://www.amazon.com/gp/product/0066620996/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0066620996&linkCode=as2&tag=105507-20&linkId=4c6cb5f2124898d7fb16ea1a11f964a8" target="_blank">Good to Great: Why Some Companies Make the Leap and Others Don't</a>. Jim Collins. HarperBusiness. Business & Money. 400 pages.</li>
<li><a href="https://www.amazon.com/Brain-Rules-Updated-Expanded-Principles-ebook/dp/B00JNYEXAM/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514951828&sr=1-1&keywords=Brain+Rules:+12+Principles+for+Surviving+and+Thriving+at+Work,+Home+and+School&linkCode=ll1&tag=105507-20&linkId=616c76aad9939e31ecf46c43d1052215" target="_blank">Brain Rules (Updated and Expanded): 12 Principles for Surviving and Thriving at Work, Home, and School</a>. John Medina. Pear Press. Self Help. 306 pages.</li>
<li><a href="https://www.amazon.com/Extreme-Ownership-U-S-Navy-SEALs-ebook/dp/B0739PYQSS/ref=as_li_ss_tl?ie=UTF8&qid=1515034358&sr=8-2&keywords=extreme+ownership&linkCode=ll1&tag=105507-20&linkId=e0198f2dba2d7dcee3751435f1cf5de6" target="_blank">Extreme Ownership: How U.S. Navy SEALs Lead and Win</a>. Jocko Willink & Leif Babin. St. Martin's Press. Self Help. 317 pages.</li>
<li><a href="https://www.amazon.com/Girl-Seven-Names-Korean-Defectors-ebook/dp/B00JD3ZL9U/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514952132&sr=1-1&keywords=The+Girl+with+Seven+Names&linkCode=ll1&tag=105507-20&linkId=6140c7ca61f5f94ba451d3405c50c3f9" target="_blank">The Girl with Seven Names: A North Korean Defector's Story</a>. Hyeonseo Lee & David John. HarperCollins Publishers. Autobiography. 320 pages.</li>
<li><a href="https://www.amazon.com/Marathon-Ultimate-Training-Programs-Marathons-ebook/dp/B005LP9GTK/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514952446&sr=1-1&keywords=Marathon:+The+Ultimate+Training+Guide:+Advice,+Plans,+and+Programs+for+Half+and+Full+Marathons&linkCode=ll1&tag=105507-20&linkId=ce70533655e018bbd2db4a3fbbc2ceb2" target="_blank">Marathon: The Ultimate Training Guide: Advice, Plans, and Programs for Half and Full Marathons</a>. Hal Higdon. Rodale. Health, Fitness & Dieting. 306 pages.</li>
<li><a href="https://www.amazon.com/Confessions-Unlikely-Runner-Averagely-Dedicated-ebook/dp/B0754MYHML/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514952583&sr=1-1&keywords=Confessions+of+an+Unlikely+Runner:+A+Guide+to+Racing+and+Obstacle+Courses+for+the+Averagely+Fit+and+Halfway+Dedicated&linkCode=ll1&tag=105507-20&linkId=5381abcdf3813b644a0daea4218f4218" target="_blank">Confessions of an Unlikely Runner: A Guide to Racing and Obstacle Courses for the Averagely Fit and Halfway Dedicated</a>. Dana Ayers. GP Press. Health, Fitness & Dieting. 169 pages.</li>
<li><a href="https://www.amazon.com/Night-School-Jack-Reacher-Novel-ebook/dp/B019B6WTRU/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514952770&sr=1-1&keywords=Night+School:+A+Jack+Reacher+Novel&linkCode=ll1&tag=105507-20&linkId=fc236e22a3a8940ff6bd0e8d7b1794c2" target="_blank">Night School: A Jack Reacher Novel</a>. Lee Child. Delacorte Press. Fiction. 498 pages.</li>
<li><a href="https://www.amazon.com/100-Startup-Reinvent-Living-Create-ebook/dp/B0067TGSOK/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514952856&sr=1-1&keywords=The+$100+Startup:+Reinvent+the+Way+You+Make+a+Living,+Do+What+You+Love,+and+Create+a+New+Future&linkCode=ll1&tag=105507-20&linkId=e526866665b144478c1aafd7c588cb7d" target="_blank">The $100 Startup: Reinvent the Way You Make a Living, Do What You Love, and Create a New Future</a>. Chris Guillebeau. Currency. Business & Money. 304 pages.</li>
<li><a href="https://www.amazon.com/Bringing-Home-Birkin-Pursuit-Coveted-ebook/dp/B0010SGR26/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514953064&sr=1-1&keywords=Bringing+Home+the+Birkin:+My+Life+in+Hot+Pursuit+of+the+World's+Most+Coveted+Handbag&linkCode=ll1&tag=105507-20&linkId=84a8f36ad77a06f3f7baa7be3eb86c07" target="_blank">Bringing Home the Birkin: My Life in Hot Pursuit of the World's Most Coveted Handbag</a>. Michael Tonello. HarperCollins Publisher. Autobiography. 275 pages.</li>
<li><a href="https://www.amazon.com/Bright-Line-Eating-Science-Living-ebook/dp/B01MUA6QAX/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1514953261&sr=1-1&keywords=Bright+Line+Eating:+The+Science+of+Living+Happy,+Thin+&+Free&linkCode=ll1&tag=105507-20&linkId=b91bb8eca954560a2865ddfffcfcf6d6" target="_blank">Bright Line Eating: The Science of Living Happy, Thin & Free</a>. Susan Peirce Thompson Ph.D. Hay House. Health, Fitness & Dieting. 325 pages.</li>
<li><a href="https://www.amazon.com/Eleven-Seconds-Tragedy-Courage-Triumph-ebook/dp/B001N2ZWVQ/ref=as_li_ss_tl?ie=UTF8&qid=1515018745&sr=8-1&keywords=Eleven+Seconds:+A+Story+of+Tragedy,+Courage+&+Triumph&linkCode=ll1&tag=105507-20&linkId=ede2e1618a602c9be4149de2b49fa576" target="_blank">Eleven Seconds: A Story of Tragedy, Courage & Triumph</a>. Travis Roy & E. M. Swift. Grand Central Publishing. Autobiography. 226 pages.</li>
<li><a href="https://www.amazon.com/Universe-Has-Your-Back-Transform-ebook/dp/B01GUIL13K/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515018919&sr=1-1&keywords=The+Universe+Has+Your+Back:+Transform+Fear+to+Faith&linkCode=ll1&tag=105507-20&linkId=a7b71914a166491a1bc34703ae3473d7" target="_blank">The Universe Has Your Back: Transform Fear to Faith</a>. Gabrielle Bernstein. Hay House, Inc. Self Help. 205 pages.</li>
<li><a href="https://www.amazon.com/Harry-Potter-Sorcerers-Stone-Rowling-ebook/dp/B0192CTMYG/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019043&sr=1-4&keywords=Harry+Potter+and+the+Sorcerer's+Stone&linkCode=ll1&tag=105507-20&linkId=6b6142fca47a7f9ccd63936cefb0b7e9" target="_blank">Harry Potter and the Sorcerer's Stone</a>. J.K. Rowling. Pottermore from J.K. Rowling. Fiction. 322 pages.</li>
<li><a href="https://www.amazon.com/Abraham-Lincoln-Lessons-Spiritual-Leadership-ebook/dp/B009AL0B0E/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019264&sr=1-1&keywords=Abraham+Lincoln:+Lessons+in+Spiritual+Leadership&linkCode=ll1&tag=105507-20&linkId=350117ca39ac8f59923ecfc62f63b2c1" target="_blank">Abraham Lincoln: Lessons in Spiritual Leadership</a>. Elton Trueblood . HarperOne. Autobiography. 195 pages.</li>
<li><a href="https://www.amazon.com/Titans-Curse-Percy-Jackson-Olympians-ebook/dp/B00280LYI2/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019414&sr=1-1&keywords=Titan's+Curse,+The+(Percy+Jackson+and+the+Olympians,+Book+3)&linkCode=ll1&tag=105507-20&linkId=56d756d9cc46836196dfe722fa9ab0dc" target="_blank">Titan's Curse, The (Percy Jackson and the Olympians, Book 3)</a>. Rick Riordan. Disney Hyperion. Fiction. 322 pages.</li>
<li><a href="https://www.amazon.com/Whistler-Novel-24-John-Grisham-ebook/dp/B01C1LUFFK/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019545&sr=1-1&keywords=The+Whistler&linkCode=ll1&tag=105507-20&linkId=30607c629311ee3ddced460448cbba51" target="_blank">The Whistler</a>. John Grisham. Doubleday. Fiction. 386 pages.</li>
<li><a href="https://www.amazon.com/You-Are-Badass-Doubting-Greatness-ebook/dp/B00B3M3VWS/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019678&sr=1-1&keywords=You+Are+a+Badass:+How+to+Stop+Doubting+Your+Greatness+and+Start+Living+an+Awesome+Life&linkCode=ll1&tag=105507-20&linkId=9eaa895e7e11bff7fa3ffe0df4d2cee3" target="_blank">You Are a Badass: How to Stop Doubting Your Greatness and Start Living an Awesome Life</a>. Jen Sincero. Running Press. Self Help. 258 pages.</li>
<li><a href="https://www.amazon.com/When-Breath-Becomes-Paul-Kalanithi-ebook/dp/B00XSSYR50/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019802&sr=1-1&keywords=when+breath+becomes+air&linkCode=ll1&tag=105507-20&linkId=3d8ff8dd8607077b0a5a5de6b0f4854d" target="_blank">When Breath Becomes Air</a>. Paul Kalanithi & Abraham Verghese. Random House. Autobiography. 258 pages.</li>
<li><a href="https://www.amazon.com/Art-Saying-NO-Reclaim-Granted-ebook/dp/B074LZG7KS/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515019931&sr=1-1&keywords=The+Art+Of+Saying+NO:+How+To+Stand+Your+Ground,+Reclaim+Your+Time+And+Energy,+And+Refuse+To+Be+Taken+For+++Granted&linkCode=ll1&tag=105507-20&linkId=19b8dbc9ec62cbffa4bfef56ddb91910" target="_blank">The Art Of Saying NO: How To Stand Your Ground, Reclaim Your Time And Energy, And Refuse To Be Taken For Granted (Without Feeling Guilty!)</a>. Damon Zahariades. Amazon Digital Services LLC. Self Help. 172 pages.</li>
<li><a href="https://www.amazon.com/How-Fail-Almost-Everything-Still-ebook/dp/B00COOFBA4/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020039&sr=1-1&keywords=How+to+Fail+at+Almost+Everything+and+Still+Win+Big:+Kind+of+the+Story+of+My+Life&linkCode=ll1&tag=105507-20&linkId=911dd188a8661ffab06feac40b55f265" target="_blank">How to Fail at Almost Everything and Still Win Big: Kind of the Story of My Life</a>. Scott Adams. Portfolio. Autobiography. 247 pages.</li>
<li><a href="https://www.amazon.com/Life-Changing-Magic-Tidying-Decluttering-Organizing-ebook/dp/B00KK0PICK/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020191&sr=1-1&keywords=The+Life-Changing+Magic+of+Tidying+Up:+The+Japanese+Art+of+Decluttering+and+Organizing&linkCode=ll1&tag=105507-20&linkId=9092f8e5550783ccb5a7c3b927bae043" target="_blank">The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing</a>. Marie Kondo. Ten Speed Press. Self Help. 226 pages.</li>
<li><a href="https://www.amazon.com/Art-Communicating-Thich-Nhat-Hanh-ebook/dp/B00BATKO2E/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020294&sr=1-1&keywords=The+Art+of+Communicating&linkCode=ll1&tag=105507-20&linkId=395f7da5673a666fde0c858abee210e6" target="_blank">The Art of Communicating</a>. Thich Nhat Hanh. HarperOne. Self Help. 179 pages.</li>
<li><a href="https://www.amazon.com/Option-Adversity-Building-Resilience-Finding-ebook/dp/B01N8R5QD7/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020394&sr=1-1&keywords=Option+B&linkCode=ll1&tag=105507-20&linkId=0fc7e1b127e8f4a1ff2e1682ada095f2" target="_blank">Option B: Facing Adversity, Building Resilience, and Finding Joy</a>. Sheryl Sandberg & Adam Grant. Knopf. Self Help. 242 pages.</li>
<li><a href="https://www.amazon.com/Hobbit-Lord-Rings-J-R-R-Tolkien-ebook/dp/B0079KT81G/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020525&sr=1-1&keywords=the+hobbit&linkCode=ll1&tag=105507-20&linkId=4a24b2e5c5185fe8b2e6ae78e120eb07" target="_blank">The Hobbit (Lord of the Rings)</a>. J.R.R. Tolkien. Houghton Mifflin Harcourt. Fiction. 235 pages.</li>
<li><a href="https://www.amazon.com/Elon-Musk-SpaceX-Fantastic-Future-ebook/dp/B00KVI76ZS/ref=as_li_ss_tl?s=digital-text&ie=UTF8&qid=1515020659&sr=1-1&keywords=elon+musk&linkCode=ll1&tag=105507-20&linkId=615bddcb7abfea4ee16a433e9908b8f6" target="_blank">Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future</a>. Ashlee Vance. Ecco. Autobiography. 374 pages.</li>
</ol>
<h3>Fave Five</h3>
<p>Here are the five most memorable books:</p>
<ul>
<li><a href="https://www.amazon.com/Girl-Seven-Names-Korean-Defectors-ebook/dp/B00JD3ZL9U/ref=as_li_ss_il?ie=UTF8&qid=1515038199&sr=8-1&keywords=The+Girl+with+Seven+Names:+A+North+Korean+Defector's+Story&linkCode=li1&tag=105507-20&linkId=58141ac765021f8e55bb3f00a7ac42ff" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B00JD3ZL9U&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00JD3ZL9U" style="border: none; margin: 0px;" width="1" /><b>The Girl with Seven Names: A North Korean Defector's Story. Hyeonseo Lee & David John. HarperCollins Publishers. Autobiography. 320 pages.</b> - The book provides some insights on what it was like living in North Korea. It also tells the story of Hyeonseo Lee's amazing journey and struggles to get out of North Korea into China and then South Korea. Once out of North Korea, she has to conceal her identity that she was a North Korean, since she could be arrested and deported back to North Korea and possibly executed in North Korea. In addition, she has to worried about her mother and her brother in North Korea as well, since if the North Korean government knew that a family member has defected, the rest of the family that are still in North Korea could experience hardship. Even with the possibility of hardship, leaving North Korea was a hard decision since they would leave their home, family and friend behind onto a new uncertain future. It is a great read.</li>
<li><a href="https://www.amazon.com/Eleven-Seconds-Tragedy-Courage-Triumph-ebook/dp/B001N2ZWVQ/ref=as_li_ss_il?s=digital-text&ie=UTF8&qid=1515039473&sr=1-1&keywords=Eleven+Seconds:+A+Story+of+Tragedy,+Courage+&+Triumph&linkCode=li1&tag=105507-20&linkId=cda4a7bc524133f0a133991109c2100c" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B001N2ZWVQ&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B001N2ZWVQ" style="border: none; margin: 0px;" width="1" /><b>Eleven Seconds: A Story of Tragedy, Courage & Triumph. Travis Roy & E. M. Swift. Grand Central Publishing. Autobiography. 226 pages.</b> - This is a very inspirational book. It is the life story of Travis Roy. He was an ice hockey player. He got a chance to realize his dream of playing for Boston University (Division I Hockey Club). However, 11 seconds into his first shift on his first game as a Terriers, he fell head first to the boards. The impact left him quadriplegic. In just a moment's time, he went from living his dream to life altering moment. The book provides insights on Roy's struggles and emotions. I am amazed on how he used his unfortunate experience to inspire others.</li>
<li><a href="https://www.amazon.com/Abraham-Lincoln-Lessons-Spiritual-Leadership-ebook/dp/B009AL0B0E/ref=as_li_ss_il?s=digital-text&ie=UTF8&qid=1515039642&sr=1-1&keywords=Abraham+Lincoln:+Lessons+in+Spiritual+Leadership&linkCode=li1&tag=105507-20&linkId=af860903f2fb104911d06262536faae6" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B009AL0B0E&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B009AL0B0E" style="border: none; margin: 0px;" width="1" /><b>Abraham Lincoln: Lessons in Spiritual Leadership. Elton Trueblood . HarperOne. Autobiography. 195 pages.</b> - Many people consider Abraham Lincoln was the best US president. Many noted his leadership skills. Found it interesting that he did not get a lot of formal schooling and the two books that he read the most were the bible and <i>Robinson Crusoe</i>. Not a surprise that bible helped to shape his belief. The book goes into some details on how his spiritual belief lead him to be the leader that he was.</li>
<li><a href="https://www.amazon.com/When-Breath-Becomes-Paul-Kalanithi-ebook/dp/B00XSSYR50/ref=as_li_ss_il?s=digital-text&ie=UTF8&qid=1515039757&sr=1-1&keywords=When+Breath+Becomes+Air&linkCode=li1&tag=105507-20&linkId=9778ea5c4312e588bcb23b6cab867651" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=B00XSSYR50&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=B00XSSYR50" style="border: none; margin: 0px;" width="1" /><b>When Breath Becomes Air. Paul Kalanithi & Abraham Verghese. Random House. Autobiography. 258 pages.</b> - A very touching book. Paul Kalanithi was a neurosurgeon in training when he was diagnosed with terminal cancer. In the book, Kalanithi tried to answer some of the philosophical questions, such as "what makes life meaningful enough to go on living?" Kalanithi had a passion for literature. He wrote the book eloquently. When reading the book, I feel like I can understand his feelings and train of thoughts.</li>
<li><a href="https://www.amazon.com/Hobbit-J-R-Tolkien/dp/054792822X/ref=as_li_ss_il?ie=UTF8&qid=1515088593&sr=8-3&keywords=the+hobbit&linkCode=li1&tag=105507-20&linkId=5637eda6975e34093ed32c3baba0718d" target="_blank"><img border="0" src="//ws-na.amazon-adsystem.com/widgets/q?_encoding=UTF8&ASIN=054792822X&Format=_SL110_&ID=AsinImage&MarketPlace=US&ServiceVersion=20070822&WS=1&tag=105507-20" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" /></a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=105507-20&l=li1&o=1&a=054792822X" style="border: none; margin: 0px;" width="1" /><b>The Hobbit (Lord of the Rings). J.R.R. Tolkien. Houghton Mifflin Harcourt. Fiction. 235 pages.</b> - I've watched the movie before. Enjoyed watching the movie very much. Heard from many people that the book is also good. Got the opportunity to read it last month. I enjoyed reading the book as well.</li>
</ul>
<p><b>Disclaimer:</b> The links above include some referral links to Amazon.com</p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-4319601318423408382017-12-19T18:00:00.001-05:002021-06-09T21:50:40.137-04:00PASS Summit Video Recordings<div class="separator" style="clear: both; text-align: center;">
<a href="https://pixabay.com/en/study-school-learn-education-1968077/" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="Image by workandapix via pixabay" border="0" data-original-height="426" data-original-width="640" height="133" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZnbn-6d-Ceb1YVpiF-8nI8_BG-6RCxyF84Uqdi11_E_el7CDL1tTOmYOokKDjCJTrzIhJbob62vUEj3dRb6dxAWjZ3fy3Br8lE3C5jLsTONNV_biVcWB6Sthg90FU_izVl1CenEzq6izk/w200-h133/study-1968077_640.jpg" width="200" /></a></div>
<p>PASS (Professional Association for SQL Server) Summit is the largest conference for data professionals working with the Microsoft SQL Server and Business Intelligence stack. Typically, PASS holds the Summit once a year (in the fall) in Seattle, Washington. PASS Summit offers a wealth of knowledge. Each year, there are hundreds of sessions on various SQL Server and Business Intelligence topics presented by industry experts as well as Microsoft employees. If you are a data professional working with Microsoft SQL Server and Business Intelligence, PASS Summit is a great event to learn and network with fellow data professionals. If you were unable to attend past PASS Summits (or if you attended and want to relive and review those sessions), you can go to the <a href="http://www.pass.org/" target="_blank">SQL PASS site</a> and watch past <a href="http://www.pass.org/Learning/Recordings/Listing.aspx" target="_blank">PASS Summit Session Recordings</a> for free. As of the writing of this blog post, you can view recordings for PASS Summit 2015 and older for free (you need to be a member of SQL PASS, but registration is free). Even though they might be a few years old, they are excellent learning resources.</p>
<p>By the way, if you can attend PASS Summit in person, I would recommend it. PASS Summit 2018 is going to be in Seattle, Washington, in November 2018. You can get more information from <a href="http://www.pass.org/summit/2018/About.aspx" target="_blank">PASS Summit 2018 site</a>.</p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-68270262116498619842017-12-12T23:15:00.001-05:002021-06-09T21:51:14.727-04:00T-SQL Tuesday #97 - 2018 Learning Goals<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s1600/sqltuesday.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="130" data-original-width="129" height="100" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9hhPrh2NIdvjBo80volMVdnteU9FGBupssnEUQNW46Iotv5aKkC7ZPmUJYSCscGQA1lAhTcz_ZToaBkAJe5qHlo96N8l1AWz54BV4lOmDcn7X6lCJKsIrVc6Kps2HdDFolG_bEvkg6iZT/s200/sqltuesday.png" width="99" /></a></div>
<p>This month T-SQL Tuesday is hosted by Malathi Mahadevan (<a href="https://curiousaboutdata.com/" target="_blank">blog</a> | <a href="https://twitter.com/sqlmal" target="_blank">twitter</a>) and it is about <a href="https://curiousaboutdata.com/2017/12/04/t-sql-tuesday-97-setting-learning-goals-for-2018/" target="_blank">2018 learning goals</a>. Given that we are at the end of 2017, this is a timely topic. As data professionals and technology enthusiasts, we are living in an interesting time. There are a lot of opportunities to learn, so much so that we might feel overwhelmed by them. Goals can certainly help to focus our learning efforts.</p><a name='more'></a>
<h3>Learning Goals</h3>
<p>In 2018, I am planning to learn more about architecting and managing cloud solutions. As I gain experience working with databases in the cloud, I feel that it would also be beneficial to learn about the different cloud components/services and how they can complement each other.</p>
<h3>Ways to Learn</h3>
<p>To help me focus, I am planning to get an Amazon AWS and/or Microsoft Azure certification. Amazon and Microsoft each publish a guideline (or blueprint) that lists the skills you need to know to pass the certification exam, and that guideline can serve as an outline of topics to study. To learn the different topics, I am planning to read published whitepapers, FAQ articles and blogs. Given the evolving nature of cloud technologies, blogs can help to keep us updated with the latest information and features.</p>
<p>I am also planning to learn by taking online courses. I enjoy online courses because they allow me to study on my own time and at my own pace. There are some Amazon AWS and Microsoft Azure online courses available on <a href="https://www.udemy.com/" target="_blank">Udemy</a> and <a href="https://www.edx.org/" target="_blank">edx</a> that I am planning to take.</p>
<p>Finally, I am also a believer in learning by doing. When learning a new concept or technology, I like to experiment and get hands-on experience. The good thing is that both Amazon and Microsoft offer opportunities to get hands-on experience with most of their cloud offerings for free for 12 months. For Amazon, we can sign up for the <a href="https://aws.amazon.com/free/" target="_blank">AWS free tier</a>. For Microsoft, we can sign up <a href="https://azure.microsoft.com/en-us/free/" target="_blank">here</a>.</p>
Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-83199024879589478302017-12-08T06:00:00.001-05:002021-06-09T21:55:44.928-04:00Backup Analysis Services Database Using PowerShell<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpf197XNFwLSUHMkHJplr6d4Qh2VoSediGjDo6vpGro2V5QdEUfws2UmWkAjLJprm4IUtovYZi3ODSer9oewcmDMihiLPXMKNQjDGKkSGpHSddwUX3zzzKPJdhaVrGToAt8qyr5PacQYLQ/s1600/PowerShell_5.0_icon.png" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="PowerShell" border="0" data-original-height="256" data-original-width="256" height="100" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpf197XNFwLSUHMkHJplr6d4Qh2VoSediGjDo6vpGro2V5QdEUfws2UmWkAjLJprm4IUtovYZi3ODSer9oewcmDMihiLPXMKNQjDGKkSGpHSddwUX3zzzKPJdhaVrGToAt8qyr5PacQYLQ/s200/PowerShell_5.0_icon.png" title="PowerShell" width="100" /></a></div>
<p>We can use PowerShell to back up Analysis Services (AS) multidimensional or tabular databases. This is great because it offers us more flexibility in automating the AS database backup process. Typically, we might have multiple Analysis Services databases within an Analysis Services instance. As a good Database Administrator responsible for Analysis Services, we want to back up all of those databases.</p><a name='more'></a><p>The PowerShell script below can be used to back up all databases within an Analysis Services instance. To accomplish this, we use Analysis Management Objects (AMO) to get the list of databases available within the Analysis Services instance. Then we use the <a href="https://docs.microsoft.com/en-us/sql/analysis-services/powershell/backup-asdatabase-cmdlet" target="_blank">Backup-ASDatabase cmdlet</a> to back up each database to an Analysis Services backup (.abf) file. Alternatively, we could use the AMO backup method to back up the Analysis Services databases, but for this script, I chose to use the Backup-ASDatabase cmdlet.</p><pre class="language-powershell"><code>
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
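# Added note: LoadWithPartialName is deprecated; on newer systems the AMO
# assembly can also be loaded with: Add-Type -AssemblyName "Microsoft.AnalysisServices"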
$ServerName = "ServerName"
$BackupPath = "C:\Temp"
$Dt = Get-Date -f yyyyMMddhhmmss
# Make sure that the backup folder exist.
$BackupLocation = "$BackupPath\$ServerName"
If (!(Test-Path $BackupLocation))
{
New-Item -ItemType Directory -Force -Path $BackupLocation
}
$Server = New-Object Microsoft.AnalysisServices.Server
$Server.Connect($ServerName)
# Loop thru each database within the SSAS server.
foreach ($Database in $Server.Databases)
{
$DatabaseName = $Database.Name
$BackupFile = $BackupLocation + "\" + $DatabaseName + "_" + $dt + ".abf"
Write-Host "Starting to backup $DatabaseName ..." -ForegroundColor Green
Backup-ASDatabase -Server $ServerName -BackupFile $BackupFile -Name $DatabaseName -ApplyCompression -ErrorAction Stop
Write-Host "$DatabaseName has been backed up successfully." -ForegroundColor Green
}</code></pre><h3>Reference</h3><ul><li><a href="https://docs.microsoft.com/en-us/sql/analysis-services/powershell/backup-asdatabase-cmdlet" target="_blank">Backup-ASDatabase cmdlet</a> - The site contains information on how to use the Backup-ASDatabase cmdlet, including the different parameters that you can use. You can also get this information by running "get-help Backup-ASDatabase -detailed" in your PowerShell window.</li></ul>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-24936012977773220092015-07-22T23:38:00.001-04:002021-06-09T21:57:49.432-04:00Remove Log Shipping When Secondary Server Not Available<p>The other day, I came across a scenario where I needed to remove log shipping from a database on the primary server. However, we had already shut down/decommissioned the secondary server. If we try to disable the log shipping from the database properties window, we would most likely get the following:</p><a name='more'></a><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkRuHiXw95rmK6I1LyHqjtMtBk8-3c7n4xmeLk0TUwKzEXJpM7cXeFqQO2y229bZKZucEsLnZdlUQ4S8dO3DCeM2SicM679DdLcOu7URduqpIZXtWy7twDw5F1uV_5dTeG4BwJdMPZI83A/s1600-h/image%25255B9%25255D.png"><img title="Database Properties" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="Database Properties" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPMksxoQqTFKwzB_8MrGcRzRTdEtTU0XNcwLX2uFR06QPQ-c6Qs30M4_6DQCl-VC9-eWPhBXuIduLd63UevL4gQAJowaEbc06uZllxLH7PRQCBDY-7MPb7OijbQ3HHQ4LmqTOt66TAqwWL/?imgmax=800" width="620" height="156" /></a></p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU9_K1f1ncQteS0KQ5r7jnuJ6c9M-EeVC4d7LCu6JX5n_JvzFJOM0ty8WWLHKj7qQItMcM7hxJi8pcMqZRj14TL_2_CwHYGBj4DvOB5xHVxeP1QGbhEu4fj2Xyy2Ic8TannzEWZ4_brKVK/s1600-h/image%25255B13%25255D.png"><img title="Error Deleting Log Shipping Configuration" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="Error Deleting Log Shipping Configuration" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjniNrqgxlFYmFNQ4xBH3BrqxpAFMj7rmu7gvRs67cWksWiIm5iwmz38RvptfIHgOYCINWrnqr76G_SMfLmvbRZWBDzut4IutLiJxzrz3Gr2eTjboHx6O9_AQjoj7oLU1zgCQjnRn0VKfgi/?imgmax=800" width="612" height="318" /></a></p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_N33WXZzuQvwGpGbqVIGyDF3XGmoyiAel6deehHf0uRsTNIG8AAQXfibKjZNCwq5hBWmmhCxjpsuzP2td9Tu7UvT-Lm-LbdEstQHQnBsGYiq_HLj4KV6HMbljjG100S3fmSY_lhNOqNri/s1600-h/image%25255B17%25255D.png"><img title="Could Not Delete Log Shipping Configuration" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLWcoT_B9e6cQ9JwV9tZAfclwS2TQSWeQgCfBSsVroeU_RNVipAQ2ykrAA-PylL7yaOFh9Va7V0mhF9EoOLYUCwlyfH6HMb-koQHPtFBLIx4tGV-a5Pmu7nSMFxQ0c0Zj96T82PfUH-QQO/?imgmax=800" width="615" height="156" /></a></p><p>We would need to remove it manually using the following T-SQL script:</p><pre class="language-sql"><code>
USE [master]
GO
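-- Optional sanity check (added; read-only): list the log shipping primary
-- databases and their configured secondaries before removing anything.
SELECT pd.primary_database, ps.secondary_server, ps.secondary_database
FROM msdb.dbo.log_shipping_primary_databases pd
LEFT JOIN msdb.dbo.log_shipping_primary_secondaries ps
ON pd.primary_id = ps.primary_id;
GO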
EXEC sp_delete_log_shipping_primary_secondary @primary_database = '[Database_Name]', @secondary_server = '[Secondary_Server_Name]', @secondary_database = '[Database_Name]';
GO
EXEC sp_delete_log_shipping_primary_database @database = '[Database_Name]'
GO</code></pre><h3>References:</h3>
<ul><li>Information on sp_delete_log_shipping_primary_secondary can be found on <a target="_blank" href="https://technet.microsoft.com/en-us/library/ms189467.aspx">https://technet.microsoft.com/en-us/library/ms189467.aspx</a></li><li>Information on sp_delete_log_shipping_primary_database can be found on <a target="_blank" href="https://msdn.microsoft.com/en-us/library/ms188799.aspx">https://msdn.microsoft.com/en-us/library/ms188799.aspx</a></li></ul> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com2tag:blogger.com,1999:blog-8792628637495446545.post-59475895406435122352015-07-20T16:37:00.001-04:002021-06-09T22:00:58.207-04:00Reflection on Steve Stedman's Database Corruption Challenge<p>Over the past few months, Steve Stedman (<a target="_blank" href="http://stevestedman.com/">blog</a> | <a target="_blank" href="https://twitter.com/sqlemt">twitter</a>) has been conducting a <a target="_blank" href="http://SteveStedman.com/Corruption">database corruption challenge</a>. It is a challenge where Steve posts corrupt SQL Server databases and the participants try to fix the corruption to the best of their ability, while avoiding any data loss.</p><a name='more'></a><p>It has been an interesting challenge and a good way for us to practice dealing with different scenarios of database corruption. Each week, Steve came up with a different set of database corruption problems to solve. For example, in one week we were able to recover corrupt data by simply querying it from a non-clustered index. In another week, we needed to retrieve the missing data from a SQL Server data page.</p><p>After each challenge, Steve would post a couple of correct solutions sent in by the participants. Oftentimes, by reading those solutions, I learned different methods of resolving database corruption (or retrieving corrupt data) than the ones I had used. The old saying "There's more than one way to skin a cat" definitely holds true in many cases.</p><p>In a perfect world, we would not have any corruption in any of our databases. However, we don't live in a perfect world. By working through the database corruption challenge, I have become more prepared to troubleshoot database corruption. I would like to thank Steve for his contribution to the SQL community in conducting the Database Corruption Challenge. Hopefully he will do more of these challenges in the future.</p><h3>Additional Resources</h3><p>While working on the database corruption challenge, some of the resources that I found useful and/or interesting were:</p><ul><li>Paul Randal's "Corruption Demo Databases and Scripts". Paul Randal is one of the subject matter experts when it comes to database corruption. He has several database corruption demos and scripts that we can download. They can be found <a target="_blank" href="http://www.sqlskills.com/blogs/paul/corruption-demo-databases-and-scripts/">here</a>.</li><li>Mark Rasmussen's article "Deciphering a SQL Server Data Page". It explains in some detail how to read the output of DBCC PAGE. The article can be found <a target="_blank" href="http://improve.dk/deciphering-a-sql-server-data-page/">here</a>.</li><li>Steve Stedman's presentation of "When Database Corruption Strikes" to the PASS DBA Virtual Chapter. It can be found <a target="_blank" href="https://www.youtube.com/watch?v=eXFcjwjApbs&feature=youtu.be">here</a>.</li><li>Kendra Little's "Corrupting Databases for Dummies - Hex Editor Edition". 
It provides some information on using a hex editor to corrupt a database. The article can be found <a target="_blank" href="http://www.littlekendra.com/2011/01/24/corrupthexeditor/">here</a>.</li></ul>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-91341646635534136922015-05-15T12:49:00.004-04:002021-06-09T22:26:01.835-04:00Change Data Capture (CDC) and Simple Recovery Mode<p>One of the misconceptions that some people have about change data capture (CDC) in SQL Server is that it needs the database to be set up with full recovery mode. If we look at the following <a title="Tuning the Performance of Change Data Capture in SQL Server 2008" href="https://technet.microsoft.com/en-us/library/dd266396(v=sql.100).aspx" target="_blank">TechNet article</a>, it mentions:</p> <blockquote>It should be noted that change data capture works with all recovery models. But when change data capture is enabled, operations that would normally be minimally logged in simple or bulk-logged recovery models are fully logged to enable change data capture to capture all changes.</blockquote><a name='more'></a> <p>So Change Data Capture can be enabled on a database in simple recovery mode. This can be helpful, since databases in development environments are often set to simple recovery mode. One thing to be careful about: if a database in simple recovery mode has CDC enabled but does not have the CDC capture job running, the transaction log will use more and more space as the database processes transactions. This is because those transactions are not released until the CDC capture process has processed them. Let me try to demonstrate this.</p> <h3>Database in Simple Recovery Mode without CDC</h3> <p>As a base case, let us set up a database with simple recovery mode.</p> <pre class="language-sql"><code>USE [master];
GO
IF EXISTS(SELECT 1 FROM sys.databases WHERE [name] = N'CdcTest')
BEGIN
DROP DATABASE CdcTest;
END
GO
CREATE DATABASE CdcTest
ON (
Name = 'CdcTest',
FileName = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\CdcTest.mdf',
Size = 10 MB,
FileGrowth = 5 MB)
LOG ON (
Name = 'CdcTest_Log',
FileName = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\CdcTest_Log.ldf',
Size = 50 MB,
FileGrowth = 5 MB);
ALTER DATABASE CdcTest
SET RECOVERY SIMPLE;
GO</code></pre><p>Now, let us create a table called “TableWithoutCDC”.</p><pre class="language-sql"><code>USE [CdcTest];
GO
CREATE TABLE dbo.TableWithoutCDC (
Id INT PRIMARY KEY IDENTITY(1,1),
Field1 VARCHAR(4000),
Field2 VARCHAR(4000),
Field3 VARCHAR(4000),
Field4 VARCHAR(4000),
Field5 VARCHAR(4000));</code></pre><p>Now, try to insert some data into the table. The script below does a one-row insert in a transaction, 5,000 times.</p><pre class="language-sql"><code>USE [CdcTest];
GO
DECLARE @z AS INT;
SET @z = 1;
WHILE (@z <= 5)
BEGIN
DECLARE @a AS INT;
SET @a = 1
WHILE (@a <= 1000)
BEGIN
DECLARE @Filler AS VARCHAR(4000);
SET @Filler = REPLICATE('a', 4000);
BEGIN TRANSACTION
INSERT INTO dbo.TableWithoutCDC
VALUES (@Filler, @Filler, @Filler, @Filler, @Filler)
COMMIT TRANSACTION
SET @a = @a + 1
END
CHECKPOINT;
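-- (Added note: in simple recovery mode, this checkpoint is what allows the
-- inactive portion of the transaction log to be truncated after each batch.)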
SET @z = @z + 1
END</code></pre><p>Now, try to get the database’s transaction log space usage:</p><pre class="language-sql"><code>USE [CdcTest];
GO
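-- Added note: sys.sysfiles is an older compatibility view; on current versions
-- the same figures can also be read from sys.database_files.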
SELECT Name,
[FileName],
file_in_MB = (CAST(size AS FLOAT) * 8)/ 1024,
file_used_in_MB = (CAST(FILEPROPERTY(name, 'SpaceUsed') AS FLOAT) * 8)/ 1024
FROM sys.sysfiles</code></pre><p>We got the following result:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWMOyPZv8lEdL0uzRNFj2de2OXVApCR_0YIkxIy0vPWyRzOX-th1ij0jYXYDGj_i2rRMCxclVTlkBqFfcSsFCq6avCsDI0Ytdtm44j4qDySwNy-iEeG5bK6c3NbqqPZHF80fH6TD20asFt/s1600-h/image%25255B3%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoVjZh5IQzge9-PHzzIavjJ-wg_t5h3NornjdP5lc4vbIK0KK7456ZP8soJXRrbp_vI3Yg15H6opdkRUX7VDQZlnjF107Q0TQQDU-UNRWlkLQdvYoF-Eq9tlZ6JCIPCV3BXQZhpajj_elL/?imgmax=800" width="644" height="52" /></a></p><p>The database’s transaction log file is using 0.8 MB of disk space.</p><h3>Database in Simple Recovery Mode with CDC</h3><p>Now, let us see the impact of CDC on the database's transaction log when the CDC capture job is not running. First, let us reset the CdcTest database.</p><pre class="language-sql"><code>USE [master];
GO
IF EXISTS(SELECT 1 FROM sys.databases WHERE [name] = N'CdcTest')
BEGIN
DROP DATABASE CdcTest;
END
GO
CREATE DATABASE CdcTest
ON (
Name = 'CdcTest',
FileName = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\CdcTest.mdf',
Size = 10 MB,
FileGrowth = 5 MB)
LOG ON (
Name = 'CdcTest_Log',
FileName = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\CdcTest_Log.ldf',
Size = 50 MB,
FileGrowth = 5 MB);
ALTER DATABASE CdcTest
SET RECOVERY SIMPLE;
GO</code></pre><p>Then, create a table called "TableWithCDC".</p><pre class="language-sql"><code>USE [CdcTest];
GO
CREATE TABLE dbo.TableWithCDC (
Id INT PRIMARY KEY IDENTITY(1,1),
Field1 VARCHAR(4000),
Field2 VARCHAR(4000),
Field3 VARCHAR(4000),
Field4 VARCHAR(4000),
Field5 VARCHAR(4000));</code></pre><p>Now, we can enable CDC on the CdcTest database and configure it to capture changes on the TableWithCDC table.</p><pre class="language-sql"><code>USE [CdcTest];
GO
EXEC sp_cdc_enable_db;
EXEC sp_cdc_enable_table @source_schema = 'dbo', @source_name = 'TableWithCDC', @capture_instance = 'dbo_TableWithCDC', @supports_net_changes = 1, @role_name = NULL;
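-- Optional check (added): confirm CDC is now enabled at the database and table level.
SELECT [name], is_cdc_enabled FROM sys.databases WHERE [name] = 'CdcTest';
SELECT [name], is_tracked_by_cdc FROM sys.tables WHERE [name] = 'TableWithCDC';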
</code></pre><p>By default, when CDC is enabled, it creates 2 SQL Agent jobs: one to capture the changes and another to clean up CDC data older than 3 days (the default, which can be modified). For this test, if you have SQL Agent running, you might want to stop and disable the CDC capture job. You can do that using the following script:</p><pre class="language-sql"><code>USE [msdb];
GO
EXEC dbo.sp_stop_job @job_name = 'cdc.CdcTest_capture';
EXEC dbo.sp_update_job @job_name = 'cdc.CdcTest_capture', @enabled = 0;</code></pre><p>Now, similar to the first test, run some transactions in the database.</p><pre class="language-sql"><code>USE [CdcTest];
GO
DECLARE @z AS INT;
SET @z = 1;
WHILE (@z <= 5)
BEGIN
DECLARE @a AS INT;
SET @a = 1
WHILE (@a <= 1000)
BEGIN
DECLARE @Filler AS VARCHAR(4000);
SET @Filler = REPLICATE('a', 4000);
BEGIN TRANSACTION
INSERT INTO dbo.TableWithCDC
VALUES (@Filler, @Filler, @Filler, @Filler, @Filler)
COMMIT TRANSACTION
SET @a = @a + 1
END
CHECKPOINT;
SET @z = @z + 1
END</code></pre><p>Now, let us see how it impacts the database's transaction log usage.</p><pre class="language-sql"><code>USE [CdcTest];
GO
SELECT Name,
[FileName],
file_in_MB = (CAST(size AS FLOAT) * 8)/ 1024,
file_used_in_MB = (CAST(FILEPROPERTY(name, 'SpaceUsed') AS FLOAT) * 8)/ 1024
FROM sys.sysfiles</code></pre><p>The database's transaction log is now using about 116 MB of disk space, much more than the 0.8 MB in the first test. Let us look at the log_reuse_wait_desc column in the sys.databases catalog view for the CdcTest database:</p><pre class="language-sql"><code>SELECT [name], log_reuse_wait_desc
FROM sys.databases
WHERE [name] = 'CdcTest'</code></pre><p>The reason the database cannot reuse the log space is "replication", which actually makes sense, since under the hood CDC uses the same technology as replication. Basically, the database will not release the transactions in the transaction log until they have been processed by the CDC capture job/process.</p><p>Now, let's process the CDC changes, do a checkpoint, and check the database's transaction log space usage:</p><pre class="language-sql"><code>USE [CdcTest];
GO
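-- sp_cdc_scan runs the same log-scan step that the CDC capture job executes;
-- since the capture job is stopped, we invoke it manually here.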
EXEC sp_cdc_scan;
GO
CHECKPOINT;
GO
SELECT Name,
[FileName],
file_in_MB = (CAST(size AS FLOAT) * 8)/ 1024,
file_used_in_MB = (CAST(FILEPROPERTY(name, 'SpaceUsed') AS FLOAT) * 8)/ 1024
FROM sys.sysfiles</code></pre><p>The database's transaction log disk usage immediately drops to about 4 MB.</p><p>An additional thing to note is that, in the above example, we ran those transactions against a table that is tracked by CDC. This would not happen if the transactions were run against a table that is not tracked by CDC in a CDC-enabled database; in that case, the transaction log will reuse the log space after a checkpoint. The following script helps to illustrate this:</p><pre class="language-sql"><code>USE [CdcTest];
GO
--- Create a table that is not being tracked by CDC within CDC enabled database.
CREATE TABLE dbo.TableWithoutCDC (
Id INT PRIMARY KEY IDENTITY(1,1),
Field1 VARCHAR(4000),
Field2 VARCHAR(4000),
Field3 VARCHAR(4000),
Field4 VARCHAR(4000),
Field5 VARCHAR(4000));
GO
--- Run transactions against the TableWithoutCDC
DECLARE @z AS INT;
SET @z = 1;
WHILE (@z <= 5)
BEGIN
DECLARE @a AS INT;
SET @a = 1
WHILE (@a <= 1000)
BEGIN
DECLARE @Filler AS VARCHAR(4000);
SET @Filler = REPLICATE('a', 4000);
BEGIN TRANSACTION
INSERT INTO dbo.TableWithoutCDC
VALUES (@Filler, @Filler, @Filler, @Filler, @Filler)
COMMIT TRANSACTION
SET @a = @a + 1
END
CHECKPOINT;
SET @z = @z + 1
END
GO
--- Now get the database transaction log disk usage
SELECT Name,
[FileName],
file_in_MB = (CAST(size AS FLOAT) * 8)/ 1024,
file_used_in_MB = (CAST(FILEPROPERTY(name, 'SpaceUsed') AS FLOAT) * 8)/ 1024
FROM sys.sysfiles</code></pre><p>The database will report that the transaction log is using around 2 MB of space (the number might vary, but it should be small).</p><h3>Clean Up</h3><p>You can use the following script to clean up:</p><pre class="language-sql"><code>USE [master];
GO
IF EXISTS(SELECT 1 FROM sys.databases WHERE [name] = N'CdcTest')
BEGIN
DROP DATABASE CdcTest;
END
GO</code></pre><h3>Take Away</h3><p>Some takeaway points: <ul><li>CDC can be enabled on databases with any recovery model, including databases in simple recovery mode.</li><li>Make sure that the CDC capture job is running, as transactions against CDC-tracked table(s) will be preserved in the transaction log until they are processed by the CDC capture job/process. This is especially important if you restore a CDC-enabled database with the KEEP_CDC option, as the CDC capture and cleanup jobs do not get created by default.</li><li>If you have CDC-tracked tables and you don't have the CDC capture job running, you can potentially see your database's transaction log grow even if it is in simple recovery mode.</li></ul></p>Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com2tag:blogger.com,1999:blog-8792628637495446545.post-40005879294181769712015-05-05T23:03:00.006-04:002021-06-10T00:03:47.523-04:00SQL Script: Database Auto Grow and/or Auto Shrink Report<p>Below is a SQL script that can be used to list recent auto grow and/or auto shrink events that happened on your SQL Server. It queries the SQL Server default trace. By default, the default trace is enabled (you can check this by running: <strong>sp_configure 'default trace enabled'</strong>).</p><a name='more'></a> <pre class="language-sql"><code>DECLARE @TracePath NVARCHAR(1000);
-- Get the file path for the default trace
SELECT
@TracePath =
REVERSE(SUBSTRING(REVERSE([path]),
CHARINDEX('\', REVERSE([path])), 1000)) + 'log.trc'
FROM
sys.traces
WHERE
is_default = 1;
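-- Added note: the default trace rolls over to files named log_1.trc, log_2.trc,
-- and so on; pointing fn_trace_gettable below at 'log.trc' with DEFAULT as the
-- second argument makes it read all of the rollover files in that folder.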
-- Query to get auto growth and shrink event from the default trace
SELECT
EventName = te.[name],
tr.ServerName,
tr.DatabaseName,
tr.[FileName],
FilePath = sf.[filename],
Duration_in_ms = tr.Duration/1000,
FileGrowth_in_mb = (tr.IntegerData*8)/1024,
tr.StartTime,
tr.EndTime
FROM
sys.fn_trace_gettable(@TracePath, DEFAULT) tr
INNER JOIN sys.trace_events te
ON tr.EventClass = te.trace_event_id
INNER JOIN sys.sysaltfiles sf
ON tr.[FileName] = sf.[name]
WHERE
te.name in (
'Data File Auto Grow',
'Log File Auto Grow',
'Data File Auto Shrink',
'Log File Auto Shrink')
ORDER BY tr.StartTime DESC;</code></pre><p>By the way, you can also get the information on recent auto grow and/or auto shrink events in SQL Server Management Studio, by doing the following:</p><ul><li>Open SQL Server Management Studio (SSMS)</li><li>In Object Explorer (you can press <strong>F8</strong> to view Object Explorer), right-click on the database that you want to get the auto grow and/or auto shrink events for</li><li>On the context menu, select <strong>Reports</strong> and then <strong>Standard Reports</strong>. Then select <strong>Disk Usage</strong>.</li><li>This opens a <strong>Disk Usage</strong> report for that particular database. If there are any recent auto grow and/or auto shrink events on that database, you should be able to see the “<strong>Data/Log Files Autogrow/Autoshrink Events</strong>” section.</li></ul><p><a href="http://lh3.googleusercontent.com/-B5p22yKCHgY/VUmDSZYChnI/AAAAAAAAAv8/_wfCJ7yGk74/s1600-h/image%25255B3%25255D.png"><img title="Data/Log Files Autogrow/Autoshrink Events" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="Data/Log Files Autogrow/Autoshrink Events" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJz71awTl14J34zvNWMO2o3vqrsy8Ucngrn5MD2iL0ZvO21cKPj2iW2T3YlRKOoPN2UgMy7vVYhNE6i2JXGxrJvF04WEs61dJjiBG8SFa-BHmAHuWusKAnf5pjTT7lMixPtsG89hmWtUDu/?imgmax=800" width="561" height="484" /></a></p><p>Hopefully, you don’t have the auto shrink feature enabled on any of your databases, and thus don’t have any auto shrink events to worry about.</p><p>As for auto grow events, if you notice a lot of them, you might want to consider pre-growing (increasing the size of) the data/log files during periods of low activity. Another thing to consider if you are noticing a lot of auto grow events is making the auto growth increment bigger. </p><p>If you experience frequent auto grow events in your database log file, it can lead to a <a title="High Virtual Log File Count" href="http://www.lucasnotes.com/2014/02/high-virtual-log-files-vlf-count.html">high Virtual Log File (VLF) count</a>.</p> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-48704360716550265462015-04-29T00:23:00.001-04:002021-06-09T22:52:47.125-04:00PowerShell Script: List SQL Server Analysis Service Role Permission By Database and Cube<p>A while ago, I wrote a couple of blog entries regarding using PowerShell scripts to list roles within the database and their members. The scripts can be found <a title="List SSAS User Roles Using PowerShell (Part 2)" href="http://www.lucasnotes.com/2013/06/list-ssas-user-roles-using-powershell.html">here</a> and <a title="List SSAS User Role Using PowerShell" href="http://www.lucasnotes.com/2012/09/list-ssas-user-roles-using-powershell.html">here</a>.</p> <p>Let’s try to expand on those scripts further.</p><a name='more'></a> <h3>List Role Permission For Each Analysis Service Database</h3> <p>If we look at the GUI, on the database level, we can assign the role to have one or more of the following permissions: full control, process database and read definition. 
</p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_UTCY5UtDb_ZPTYzjo2z90UrUE6r0ytm_o81Gbv_cUYxvxciSnXrlSHKWXDR6ehZKgcvGh6RY9YDtvMC2adr48f35Bj0Vt1wdOM8OB6dp4prcxLdSx-jR8fELikSjbvkZO6BQqaR5Vd1c/s1600-h/image%25255B3%25255D.png"><img title="SSAS Role Properties" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="SSAS Role Properties" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9TFAosGaRo_s4RnlEO1R3ptmbJSGPvROnxyGDalbC81k6pC1Bu4SmVNTK_anHe3uB3fVq10Rlhq6na1S8ST50S_BIHajXvyODX0S47qotPfFJ6ut7xZnfMTsrgyrMUFzQTOxZhXAhZcPm/?imgmax=800" width="539" height="484" /></a></p> <p>We can use the following PowerShell script to list role permission for each database:</p> <pre class="language-powershell"><code>[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
# SSAS server name variable
$SSASServerName = "ServerName"
# Try to connect to the SSAS server
$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASServerName)
# Object to store the result
$Result = @()
# Get the SSAS databases and loop thru each of them
foreach ($DB in $SSASServer.Databases)
{
# Get the SSAS database
$SSASDatabase = $SSASServer.Databases.Item($DB.name)
foreach ($Role in $SSASDatabase.Roles)
{
$DatabasePermission = $DB.DatabasePermissions.GetByRole($Role.ID)
$ItemResult = New-Object System.Object
$ItemResult | Add-Member -type NoteProperty -name DatabaseName -value $DB.Name
$ItemResult | Add-Member -type NoteProperty -name RoleName -value $Role.Name
$ItemResult | Add-Member -type NoteProperty -name DatabaseAdministrator -value $DatabasePermission.Administer
$ItemResult | Add-Member -type NoteProperty -name DatabaseProcess -value $DatabasePermission.Process
$ItemResult | Add-Member -type NoteProperty -name DatabaseReadDefinition -value $DatabasePermission.ReadDefinition
$Result +=$ItemResult
}
}
$Result | Select DatabaseName, RoleName, DatabaseAdministrator, DatabaseProcess, DatabaseReadDefinition | format-table * -Wrap -AutoSize | Out-String</code></pre><h3>List Role Permission For Each Cube within the Analysis Service Database</h3><p>We can also define role permissions for each cube within the Analysis Services database. If we look at the GUI, the basic role permissions for each cube would look like:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjgpLNFVWB7lGlsCAdoxs-HZNtBk_9h1XajG-VctiJrAhP7QEZrsVBPEEpMHMPgGETWAPzytiaY8i_QvrIGNT_uDoexTlvNbQ26KE87g9WNmT0dLPFwiqDlEUZ-zy3VlMJaY6HeJfhQmCvu/s1600-h/image%25255B7%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhaa6LyBPqH3F0VfYp7SosFDBtjQ2nTP8_APnMJ3U5Ny_5xWjbZmzHUCvLlRStLJThqBqds4zrptRxvZ8WNMVWrbOQMvfH4hMiLgKx7huLuaigHHy9Qcypzno8l2_wi3Oy48ooA7WdtbdK6/?imgmax=800" width="539" height="484" /></a></p><p>We can use the following PowerShell script to iterate over the permissions of each role for each cube within each database:</p><pre class="language-powershell"><code>[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
# SSAS server name variable
$SSASServerName = "ServerName"
# Try to connect to the SSAS server
$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASServerName)
# Object to store the result
$Result = @()
# Get the SSAS databases and loop thru each of them
foreach ($DB in $SSASServer.Databases)
{
# Get the SSAS database
$SSASDatabase = $SSASServer.Databases.Item($DB.name)
foreach ($Role in $SSASDatabase.Roles)
{
# Get the SSAS cubes within the database
foreach ($Cube in $DB.Cubes)
{
$CubePermission = $Cube.CubePermissions.GetByRole($Role.ID)
$ItemResult = New-Object System.Object
$ItemResult | Add-Member -type NoteProperty -name DatabaseName -value $DB.Name
$ItemResult | Add-Member -type NoteProperty -name RoleName -value $Role.Name
$ItemResult | Add-Member -type NoteProperty -name CubeName -value $Cube.Name
$ItemResult | Add-Member -type NoteProperty -name CubeRead -value $CubePermission.Read.value__
$ItemResult | Add-Member -type NoteProperty -name CubeWrite -value $CubePermission.Write.value__
$ItemResult | Add-Member -type NoteProperty -name CubeProcess -value $CubePermission.Process
$ItemResult | Add-Member -type NoteProperty -name CubeReadDefinition_Local -value $CubePermission.ReadDefinition.value__
$ItemResult | Add-Member -type NoteProperty -name CubeReadSourceData_Drillthrough -value $CubePermission.ReadSourceData.value__
$Result +=$ItemResult
}
}
}
$Result | Select DatabaseName, RoleName, CubeName, CubeRead, CubeWrite, CubeProcess, CubeReadDefinition_Local, CubeReadSourceData_Drillthrough | format-table * -Wrap -AutoSize | Out-String</code></pre><h3>Combining the Scripts</h3><p>As a bonus, below is a PowerShell script that can be used to list the role permissions for the cubes and databases within a SQL Server Analysis Services instance:</p><pre class="language-powershell"><code>[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$FormatEnumerationLimit = -1
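# Added note: setting this to -1 stops PowerShell from truncating long
# collections (such as the RoleMembers list below) with "..." in the output.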
# SSAS server name variable
$SSASServerName = "ServerName"
# Try to connect to the SSAS server
$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASServerName)
# Object to store the result
$Result = @()
# Get the SSAS databases and loop thru each of them
foreach ($DB in $SSASServer.Databases)
{
# Get the SSAS database
$SSASDatabase = $SSASServer.Databases.Item($DB.name)
foreach ($Role in $SSASDatabase.Roles)
{
$DatabasePermission = $DB.DatabasePermissions.GetByRole($Role.ID)
# Get the SSAS cubes within the database
foreach ($Cube in $DB.Cubes)
{
$CubePermission = $Cube.CubePermissions.GetByRole($Role.ID)
$ItemResult = New-Object System.Object
$ItemResult | Add-Member -type NoteProperty -name DatabaseName -value $DB.Name
$ItemResult | Add-Member -type NoteProperty -name RoleName -value $Role.Name
$ItemResult | Add-Member -type NoteProperty -name DatabaseAdministrator -value $DatabasePermission.Administer
$ItemResult | Add-Member -type NoteProperty -name DatabaseProcess -value $DatabasePermission.Process
$ItemResult | Add-Member -type NoteProperty -name DatabaseReadDefinition -value $DatabasePermission.ReadDefinition
$ItemResult | Add-Member -type NoteProperty -name CubeName -value $Cube.Name
$ItemResult | Add-Member -type NoteProperty -name CubeRead -value $CubePermission.Read.value__
$ItemResult | Add-Member -type NoteProperty -name CubeWrite -value $CubePermission.Write.value__
$ItemResult | Add-Member -type NoteProperty -name CubeProcess -value $CubePermission.Process
$ItemResult | Add-Member -type NoteProperty -name CubeReadDefinition_Local -value $CubePermission.ReadDefinition.value__
$ItemResult | Add-Member -type NoteProperty -name CubeReadSourceData_Drillthrough -value $CubePermission.ReadSourceData.value__
$ItemResult | Add-Member -type NoteProperty -name RoleMembers -value ($Role.Members | Select -ExpandProperty Name)
$Result +=$ItemResult
}
}
}
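# Added note: format-table is meant for on-screen reading; to keep the results,
# the final pipeline stage could instead be, for example:
#   $Result | Export-Csv -Path SsasRolePermissions.csv -NoTypeInformation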
$Result | Select DatabaseName, RoleName, DatabaseAdministrator, DatabaseProcess, DatabaseReadDefinition, `
CubeName, CubeRead, CubeWrite, CubeProcess, CubeReadDefinition_Local, CubeReadSourceData_Drillthrough, RoleMembers | format-table * -Wrap -AutoSize | Out-String</code></pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com4tag:blogger.com,1999:blog-8792628637495446545.post-83438267069494879002015-04-22T00:22:00.001-04:002021-06-09T22:54:39.318-04:00PowerShell Script: Adding Database into Always On Group<p>Every once in a while, I have  a new database that I need to put into an existing Always On group. So I have decided to create a PowerShell script that would do just that. The script can be found <a title="PowerShell script to put database into Always On" href="https://drive.google.com/file/d/0B1rAJDpqc-JdY2pudmotczI2TkE/view?usp=sharing" target="_blank">here</a>.</p><a name='more'></a> <p>The script accepts the following parameters:</p> <ul> <li><strong>DBName</strong> – The name of the database that we need to put into Always On Group.</li> <li><strong>DAGName</strong> – The name of the Always On group</li> <li><strong>ReplicaNode</strong> – Either the primary replica or one of the secondary replica of the Always On Group</li> <li><strong>BackupLocation</strong> – Network share location in which each of the replica node can read or write the database backup</li> </ul> <h3>Assumptions</h3> <ul> <li>The database is located in the primary replica node of the Always On group.</li> <li>The Always On group has already been created, but the database has not been joined to the group.</li> </ul> <h3>Script Logic</h3> <p>In summary, the script would do the following:</p> <ul> <li>Information gathering</li> <ul> <li>Find Always On primary replica</li> <li>Find Always On secondary replica(s)</li> </ul> <li>Run some validations</li> <ul> <li>Check to see if the database exists</li> <li>Database is not one of the system database (master, model, tempdb or msdb)</li> <li>Database is online</li> <li>Database is in full recovery mode</li> <li>Database is currently not in any Always On group</li> </ul> <li>Backup the database (full and transaction log) from the primary replica into the backup location</li> <li>Restore the database using the full and transaction log backup onto each of the secondary replica</li> <li>Add the database into Always On group on the primary replica and each of the secondary replica</li> </ul> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-60687485617918141052015-04-14T08:00:00.000-04:002015-04-22T17:40:32.668-04:00Stale View<p>If you have non-schema bound view in SQL Server and the underlying table was updated, you potentially can have a stale view. 
A quick way to demonstrate this:</p> <p>Let's say that you have the following table:</p> <pre class="brush: sql;">CREATE TABLE dbo.Contact (<br /> Id INT IDENTITY(1,1) PRIMARY KEY,<br /> FirstName VARCHAR(10),<br /> LastName VARCHAR(10));<br />GO<br /><br />INSERT INTO dbo.Contact VALUES ('John', 'Doe');<br />INSERT INTO dbo.Contact VALUES ('Jane', 'Doe');</pre><p>Then, create a simple view against the newly created table:</p><pre class="brush: sql;">CREATE VIEW dbo.vwContact<br />AS<br /> SELECT *<br /> FROM dbo.Contact<br />GO</pre><p>If we run the following query:</p><pre class="brush: sql;">SELECT Id, FirstName, LastName<br />FROM dbo.vwContact</pre><p>We would get the expected result:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2mYp1KGc4x57nn-OZ_ceicn1aO3MfMhtqdNFwImUlzjGarL2BTquIZwzYqdsQp9_8qGC0gLccPX2pvkZ1Evl-BGvSTPQkTkd3IybtqpURrwMF8rOP0INL6D7UDXYzCvpBWEM9PFX0zpzU/s1600-h/image%25255B5%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; margin: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkhIehe-bRNzGBEE808YWp7HQB9Ukp6qd8kZu1OJg2DeuswsAg5R8MtX52m8VXVBxipip0WDRoZszXRzpOWwWHUnSobUilkebQU8CK9lTqHPfXrsbcObvCotfoyj2_zPbtcz7KxZMJ4YUk/?imgmax=800" width="194" height="64" /></a></p><p>Now, what happens if we add a new column to the Contact table:</p><pre class="brush: sql;">ALTER TABLE dbo.Contact<br />ADD Country VARCHAR(10) NULL;<br />GO<br /><br />UPDATE dbo.Contact<br />SET Country = 'USA'<br />WHERE Id = 1;</pre><p>If we run the following query against the vwContact view:</p><pre class="brush: sql;">SELECT Id, FirstName, LastName, Country<br />FROM dbo.vwContact;</pre><p>We would get the following error message:</p><blockquote><br /> <p>Msg 207, Level 16, State 1, Line 28<br /> <br />Invalid column name 'Country'.</p><br /></blockquote><p>That’s odd; it looks like the view does not recognize the newly added Country column.</p><p>OK, let's say that your company gets a new contact named Mr. Rhino Hippopotamus. Obviously, you will need a bigger last name column, so you decide to change the LastName column from varchar(10) to varchar(20).</p><pre class="brush: sql;">ALTER TABLE dbo.Contact<br />ALTER COLUMN LastName VARCHAR(20);<br />GO<br /><br />INSERT INTO dbo.Contact VALUES ('Rhino', 'Hippopotamus', 'USA');<br />GO</pre><p>If you try to query the view using the following query (let’s forget about the Country column for now):</p><pre class="brush: sql;">SELECT Id, FirstName, LastName<br />FROM dbo.vwContact;</pre><p>That seems to work just fine. 
You get the expected result:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-v5N31PQm8ZuSERvnOfkIVKhLqPAr71VWgByJLOKGgTzEYuo8XiQKBk-yWMuEiymtT0RVlpRCeHFVfyiACq1ZrybWhe7d9U4CMUZm82XZlFjMLGqepeHnIcTc4BuP3Kde6nuhsCJrGYjZ/s1600-h/image%25255B8%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; margin: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiH12I13c62cZiVzlTt5OCSyQ0pwBqq9As3JDpOIHSqqLFxo1g_mW3oyVbuo29jYGuxNXfWCQR_as3iQsZjrFFtkPa5g7Jj0FyxR7bS7KEArFmPWcNo-ik8S8CFD_EMo3rs1X-wKw_T5e0B/?imgmax=800" width="207" height="81" /></a></p><p>But what happens if you try to run the same query from another server, using a linked server:</p><pre class="brush: sql;">SELECT Id, FirstName, LastName<br />FROM ServerA.TestDB.dbo.vwContact;</pre><p>Instead of getting the expected result, we get the following error:</p><blockquote><br /> <p>Msg 7347, Level 16, State 1, Line 1<br /> <br />OLE DB provider 'SQLNCLI11' for linked server 'HSQL01' returned data that does not match expected data length for column '[ServerA].[TestDB].[dbo].[vwContact].LastName'. The (maximum) expected data length is 10, while the returned data length is 12.</p><br /></blockquote><p>If we check the column properties for the LastName column of the vwContact view from the server that has the linked server to the original server, using the following query, we notice that somehow the LastName column is still listed as being 10 characters long:</p><pre class="brush: sql;">USE [master];<br />GO<br /><br />EXEC sp_columns_ex 'ServerA', 'vwContact', 'dbo', 'TestDB', 'LastName', NULL;</pre><h3>Solution</h3><p>So how do we address a stale view? There are several ways:</p><p>We could drop and recreate the view. In the above example, we can run the following query:</p><pre class="brush: sql;">IF (OBJECT_ID('dbo.vwContact') IS NOT NULL)<br /> DROP VIEW dbo.vwContact<br />GO<br /><br />CREATE VIEW dbo.vwContact<br />AS<br /> SELECT *<br /> FROM dbo.Contact<br />GO</pre><p>We could also run an alter view statement:</p><pre class="brush: sql;">ALTER VIEW dbo.vwContact<br />AS<br /> SELECT *<br /> FROM dbo.Contact<br />GO</pre><p>Alternatively, we could refresh the view using the following script:</p><pre class="brush: sql;">EXEC sp_refreshview 'dbo.vwContact';</pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-62895919433373536012015-03-24T20:51:00.001-04:002015-04-01T08:57:01.393-04:00SQL Server Agent Job Last Run Time and Result<p>I've seen situations in which we have some SQL Server Agent jobs that are scheduled to run very rarely (like once every year or once every half year). Whether for an audit or for troubleshooting, we need to find out those SQL Server Agent jobs' last run time and their outcome. However, the SQL Server Agent history is being purged every few months (<strong>Note:</strong> purging SQL Server Agent history periodically is not a bad thing; it is actually a good practice to keep the msdb database size manageable, especially if your environment has lots of SQL Server Agent jobs, many of which have lots of job steps, with some scheduled to run frequently - every few minutes or even seconds).</p> <p>Are we out of luck?
Well not necessarily.</p> <h3>Demo</h3> <p>First create a SQL Agent job with the following script:</p> <pre class="brush: sql;">USE [msdb]<br />GO<br /><br />BEGIN TRANSACTION<br />DECLARE @ReturnCode INT<br />SELECT @ReturnCode = 0<br /><br />IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)<br />BEGIN<br />EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br /><br />END<br /><br />DECLARE @jobId BINARY(16)<br />EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'SQL Job Test', <br /> @enabled=1, <br /> @notify_level_eventlog=0, <br /> @notify_level_email=0, <br /> @notify_level_netsend=0, <br /> @notify_level_page=0, <br /> @delete_level=0, <br /> @description=N'No description available.', <br /> @category_name=N'[Uncategorized (Local)]', <br /> @owner_login_name=N'sa', @job_id = @jobId OUTPUT<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br /><br />EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Job Step 1', <br /> @step_id=1, <br /> @cmdexec_success_code=0, <br /> @on_success_action=3, <br /> @on_success_step_id=0, <br /> @on_fail_action=2, <br /> @on_fail_step_id=0, <br /> @retry_attempts=0, <br /> @retry_interval=0, <br /> @os_run_priority=0, @subsystem=N'TSQL', <br /> @command=N'select 1', <br /> @database_name=N'master', <br /> @flags=0<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br /><br />EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Job Step 2', <br /> @step_id=2, <br /> @cmdexec_success_code=0, <br /> @on_success_action=1, <br /> @on_success_step_id=0, <br /> @on_fail_action=2, <br /> @on_fail_step_id=0, <br /> @retry_attempts=0, <br /> @retry_interval=0, <br /> @os_run_priority=0, @subsystem=N'TSQL', <br /> @command=N'select 2', <br /> @database_name=N'master', <br /> @flags=0<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br />EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br />EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'<br />IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback<br />COMMIT TRANSACTION<br />GOTO EndSave<br />QuitWithRollback:<br /> IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION<br />EndSave:<br /><br /><br />GO</pre><p>Then we run the job using the following script:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />EXEC sp_start_job @job_name=N'SQL Job Test';<br />GO</pre><p>Once that has been completed, there should be a job history for the SQL Server agent job:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKGo1fVKTXn-4JboxAYJ_jOD1SHRhBbW8_w-zHSSXVTkHbAyOXlAkARIyzXxYmxkLxDrEJFifOsmmPcGj06MxS9L3ZfPThGMbVY8IyhxAVUXTj5xSzpK-HECpbgam2W_MHoa0SO0eQEZVd/s1600-h/image%25255B13%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtbxviPj5dpSiuf1N5R5IC_TCF7QaV6vk-zpMgL1ByKbefN8DJpHOKxCfUvZUKmJwvBBgSz4L4ob1fm4WLvsq-pHA0ckZNlljXz-ToskJD8uyWOYtauG4fo0Ly5f1bfuTsh3nOBdzrXlQn/?imgmax=800" width="644" height="477" /></a></p><p>Now purge the job history by running the 
following:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />EXEC dbo.sp_purge_jobhistory @job_name=N'SQL Job Test';<br />GO<br /></pre><p>After purging the job history, we will no longer have the entries for the “SQL Job Test” job in the dbo.sysjobhistory table in the msdb database.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiagUnD3Qweu58avxdZoDW0u8RLTTfpFLKWESTnED_Xf9ylS0ej3sPGvXnKiDB3r9HB_ASdOpzugynxHY69o8T9AX3Ve33kVX-FU-Q6yP4EIzydOAuMYMrpciNdaSfuGDiR9P9O4j4CP1-R/s1600-h/image%25255B17%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgj1stZQP3LZ9l1FFv2V68jMqeAaoUGt2AxMQdg7OQloK7RZ-l7UkEXbP3OJRZvggopOmL0Ul2xQUBLjVTeYo-yGghPnvts9-7lB2t4fSpE1pHS7mJ-q07eVTOPk_nSWaHIvC3mEZXtgWg_/?imgmax=800" width="644" height="477" /></a></p><p>If we try to query sysjobhistory using the following query, we would get 0 rows:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />SELECT *<br />FROM dbo.sysjobhistory<br />WHERE job_id IN (<br /> SELECT job_id<br /> FROM dbo.sysjobs<br /> WHERE [name] = N'SQL Job Test');</pre><p>Or this query:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />EXEC sp_help_jobhistory @job_name = N'SQL Job Test';<br />GO<br /></pre><p>Gulp! No more history for SQL job “SQL Job Test”. But wait, if we look at the job properties, we can see when the job was last executed.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgwHZY8dfwIjkdxYjBwSRgECWY0Jqd6lZlm71CkBzcyvDv3tLx2Z4Irf0j81yEY04G_XPGSGVrKebduwkAJQl5WaxTP2K46Kjh6nJxHO6NCSGokRFR_TapurQvL5Z1iCzP0su3GzjSbI3r8/s1600-h/image%25255B21%25255D.png"><img title="image" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; display: inline; padding-right: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiXql629an5-UCzRjvnseWi6XdCMNOeLI-8Nj6Iix-drMD_WibfRabRL-TiVdVA_zx2vlUT5nHU3FP3vdzq5LgeSrWJ_PADupk_GI6HpRfSp5S0FNtiIpqObPiQDGUworV5X0nJJPPlgh5M/?imgmax=800" width="539" height="484" /></a></p><p>So the information must still be in the database. If we try to run the following query:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />EXEC sp_help_job @job_name = N'SQL Job Test';<br /></pre><p>We can see the job’s last run date, time and even outcome (for the definition of the last_run_outcome column, you can check the following <a href="https://msdn.microsoft.com/en-us/library/ms186722.aspx" target="_blank">MSDN article</a>). There’s more. When you run the sp_help_job stored procedure, you can also see each job step’s last run date, time and outcome. </p><p>If we investigate further, the last run date, time and outcome are stored in sysjobservers and sysjobsteps within the msdb database. 
We can query them using the following:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />SELECT j.job_id,<br /> j.Name,<br /> js.last_run_outcome,<br /> last_run_outcome_desc = CASE <br /> WHEN js.last_run_outcome = 0<br /> THEN 'Failed'<br /> WHEN js.last_run_outcome = 1<br /> THEN 'Succeeded'<br /> WHEN js.last_run_outcome = 3<br /> THEN 'Canceled'<br /> ELSE 'Unknown'<br /> END,<br /> js.last_outcome_message,<br /> last_run_datetime = dbo.agent_datetime(<br /> CASE WHEN js.last_run_date = 0 THEN NULL ELSE js.last_run_date END, <br /> CASE WHEN js.last_run_time = 0 THEN NULL ELSE js.last_run_time END),<br /> js.last_run_duration<br />FROM dbo.sysjobs j<br /> INNER JOIN dbo.sysjobservers js <br /> ON j.job_id = js.job_id<br /></pre><p>Or if you want to see it for each of the job’s steps:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />SELECT j.job_id,<br /> j.Name,<br /> js.last_run_outcome,<br /> last_run_outcome_desc = CASE <br /> WHEN js.last_run_outcome = 0<br /> THEN 'Failed'<br /> WHEN js.last_run_outcome = 1<br /> THEN 'Succeeded'<br /> WHEN js.last_run_outcome = 3<br /> THEN 'Canceled'<br /> ELSE 'Unknown'<br /> END,<br /> js.last_outcome_message,<br /> last_run_datetime = dbo.agent_datetime(<br /> CASE WHEN js.last_run_date = 0 THEN NULL ELSE js.last_run_date END, <br /> CASE WHEN js.last_run_time = 0 THEN NULL ELSE js.last_run_time END),<br /> js.last_run_duration,<br /> jst.step_id,<br /> jst.step_name,<br /> step_last_run_outcome = jst.last_run_outcome,<br /> step_last_run_outcome_desc = CASE <br /> WHEN jst.last_run_outcome = 0<br /> THEN 'Failed'<br /> WHEN jst.last_run_outcome = 1<br /> THEN 'Succeeded'<br /> WHEN jst.last_run_outcome = 2<br /> THEN 'Retries'<br /> WHEN jst.last_run_outcome = 3<br /> THEN 'Canceled'<br /> ELSE 'Unknown'<br /> END,<br /> step_last_run_datetime = dbo.agent_datetime(<br /> CASE WHEN jst.last_run_date = 0 THEN NULL ELSE jst.last_run_date END, <br /> CASE WHEN jst.last_run_time = 0 THEN NULL ELSE jst.last_run_time END)<br />FROM dbo.sysjobs j<br /> INNER JOIN dbo.sysjobservers js <br /> ON j.job_id = js.job_id<br /> INNER JOIN dbo.sysjobsteps jst<br /> ON j.job_id = jst.job_id<br /><br />WHERE j.name = N'SQL Job Test';</pre><h3>Clean-up</h3><p>To clean up the demo, you can run the following script:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />EXEC sp_delete_job @job_name = N'SQL Job Test';<br /><br />GO</pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-88086014568893610672015-03-24T20:20:00.001-04:002015-03-24T20:23:12.590-04:00File Share Witness<p>If you have a cluster environment and are using a file share witness, you can use the following command line (using cluster.exe) to get the location of your file share witness:</p> <pre class="brush: sql;">cluster.exe res "FileShare Witness" /priv</pre><p>Or if you prefer to use PowerShell, you can use:</p><pre class="brush: sql;">Get-ClusterResource "File Share Witness" | Get-ClusterParameter | fl</pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-7406130773485194232014-03-16T20:00:00.001-04:002014-03-16T20:00:43.999-04:00Microsoft Access–Reseed Auto Number<p>I was developing a small database using Microsoft Access 2010. In that database, I used Auto Number columns on some of the tables. 
During the development of the database, I entered a lot of test data into those tables.</p> <p>Then came the time to clean up the test data, so that the database could be used for the live data. Deleting the data from the tables is pretty straightforward. However, deleting the data does not reset any of the auto number columns. In SQL Server, we could use the TRUNCATE TABLE command to delete the data in the table and reset the identity column (the auto number), or use DBCC CHECKIDENT to reseed the identity column. However, neither is available in Microsoft Access.</p> <p>So to reseed an auto number in Microsoft Access, you can do the following:</p> <ol> <li>Delete the data from the table</li> <li>Run “Compact and Repair Database”. It can be found under the “Database Tools” tab on the Microsoft Access ribbon. <br /></li> </ol> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-81452046877799580102014-02-10T10:37:00.000-05:002014-02-10T10:50:20.636-05:00High Virtual Log Files (VLF) Count<p>Virtual log files are basically segments of the database log file. Every time the transaction log grows in size, the number of virtual log files also grows. If you have a database with a transaction log that keeps growing in small increments, you can end up with a lot of virtual log files, and that can be a bad thing. </p> <p>In SQL Server 2012, if you have a database with more than 10,000 virtual log files, you will get a warning message in the SQL Server log. To demonstrate this:</p> <p>First I create a database called <strong>VLFTest</strong>. I set the log file for the database to have an initial size of 512 KB (at this size, 2 VLFs will be created).</p> <pre class="brush: sql;">USE [master];<br /><br />IF DB_ID('VLFTest') IS NOT NULL<br />BEGIN<br /> DROP DATABASE VLFTest;<br />END;<br />GO<br /><br />CREATE DATABASE VLFTest<br />ON (<br /> NAME = 'VLFTest_Data',<br /> FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\VLFTest_Data.mdf')<br />LOG ON (<br /> NAME = 'VLFTest_Log',<br /> FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\VLFTest_Log.ldf',<br /> SIZE = 512 KB,<br /> MAXSIZE = 10240 MB,<br /> FILEGROWTH = 512 KB);<br />GO</pre><p>Once the database is created, I can check the number of VLFs created by running the following query:</p><pre class="brush: sql;">DBCC LOGINFO ('VLFTest');</pre><p>The query should return 2 rows, which would indicate that there are two VLFs created.</p>
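<p>As a side note, DBCC LOGINFO is undocumented. If you are on SQL Server 2016 SP2 or later (newer than the SQL Server 2012 instance used in this demo), the documented sys.dm_db_log_info function returns one row per VLF, so the count can be taken with a regular query. A minimal sketch:</p><pre class="brush: sql;">-- Requires SQL Server 2016 SP2 or later<br />SELECT COUNT(*) AS vlf_count<br />FROM sys.dm_db_log_info(DB_ID('VLFTest'));</pre>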
<p>The next thing I would do is to grow the transaction log by 512 KB increments, 5,000 times. Since each 512 KB increment creates 2 new VLFs, doing it 5,000 times generates 10,000 new VLFs, so at the end I end up with 10,002 VLFs. I can do this by using the following script (note that we read the current size first, then grow the log by 512 KB):</p><pre class="brush: sql;">DECLARE @CurrentSize AS INT,<br /> @LogFileQuery AS NVARCHAR(1000);<br /> <br />--- Get current log file size.<br />SELECT @CurrentSize = size*8 FROM sys.master_files WHERE database_id = DB_ID('VLFTest') AND name = 'VLFTest_Log';<br />PRINT 'Current log file size is ' + CAST(@CurrentSize AS VARCHAR(1000)) + ' KB';<br /><br />--- Try to grow the transaction log by 512 KB. This should result in two additional VLFs.<br />PRINT 'Growing the transaction log to ' + CAST((@CurrentSize + 512) AS VARCHAR(1000)) + ' KB';<br />SET @LogFileQuery = 'ALTER DATABASE VLFTest MODIFY FILE (NAME = VLFTest_Log, SIZE = ' + CAST((@CurrentSize + 512) AS VARCHAR(1000)) + ' KB);';<br />EXEC (@LogFileQuery);<br />GO 5000</pre><p>Now if I run the following query:</p><pre class="brush: sql;">DBCC LOGINFO('VLFTest');</pre><p>I would get 10,002 rows back, which means that I have 10,002 VLFs in the <strong>VLFTest</strong> database’s transaction log.</p><p>Now, the warning in the SQL Server log unfortunately will not be triggered the moment the VLF count goes beyond 10,000. I found out that it will be triggered under the following conditions:</p><ul><li>When the SQL Server is restarted </li><li>When the database is set back online (after it is set offline) </li><li>When we restore the database </li></ul><p>So for a quick test, I set the database offline and then back online by using the following query:</p><pre class="brush: sql;">USE master;<br />GO<br />ALTER DATABASE VLFTest SET OFFLINE;<br />GO<br />ALTER DATABASE VLFTest SET ONLINE;<br />GO</pre><p>After the <strong>VLFTest</strong> database is back online, when I go to the SQL Server log, I get the following message:</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGw8KL1IVnYZKa2RBr0p1mTZQ2Jxhk2_0Flokyqbcy5oZT8Pdd_yPM-bhBDrBvBuX-UEPiAJ6pQKSKiH20LbP9GLYobxEZ5yYYBN-3bz6q9fd5HuvX8s6OUoBsi9KUo-NLtmSyc1hLwank/s1600-h/image%25255B3%25255D.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6yXeBzQ9xrI7bKuNhQbgO4NIk-XiwhkZhRtOq1_Iicohseji_NQlwgNv-wiBlaTtAeUA7n52oIT4eGB1kIXmAJx-4fJ9R8IvP-HV3PxVXqBxmuEA5Zj07zEXNb6B1zY4f-nWld_p9eIlY/?imgmax=800" width="500" height="340" /></a></p><blockquote>Database VLFTest has more than 1000 virtual log files which is excessive. Too many virtual log files can cause long startup and backup times. Consider shrinking the log and using a different growth increment to reduce the number of virtual log files. </blockquote><p>(Interestingly, the message text says 1000 virtual log files, even though the warning is only raised once the count goes beyond 10,000.)</p><h3>Why Can Having Too Many VLFs Be a Bad Thing?</h3><p>Having an excessive VLF count can be a bad thing for your database. For one, when I tried to bring the database back online after taking it offline, it took longer than expected. When I run the following query:</p><pre class="brush: sql;">USE master;<br />ALTER DATABASE VLFTest SET OFFLINE;<br />GO<br />SET STATISTICS TIME ON;<br />ALTER DATABASE VLFTest SET ONLINE;<br />GO</pre><p>After running it 5 times, on average it took about 23 seconds for the database to come online. 
Now compare this with the following:</p><pre class="brush: sql;">USE master;<br />GO<br />IF DB_ID('VLFTest2') IS NOT NULL<br />BEGIN<br /> DROP DATABASE VLFTest2;<br />END;<br />GO<br /><br />CREATE DATABASE VLFTest2<br />ON (<br /> NAME = 'VLFTest2_Data',<br /> FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\VLFTest2_Data.mdf')<br />LOG ON (<br /> NAME = 'VLFTest2_Log',<br /> FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\VLFTest2_Log.ldf',<br /> SIZE = 2560512 KB,<br /> MAXSIZE = 10240 MB,<br /> FILEGROWTH = 1024 KB);<br />GO</pre><p>It is basically the same database, but instead of growing the transaction log file from 512 KB to 2,560,512 KB in 512 KB increments (creating 10,002 VLFs in the process), I create the transaction log at 2,560,512 KB from the start. By doing this, I only have 16 VLFs. Now when I run the following query:</p><pre class="brush: sql;">USE master;<br />ALTER DATABASE VLFTest2 SET OFFLINE;<br />GO<br />SET STATISTICS TIME ON;<br />ALTER DATABASE VLFTest2 SET ONLINE;<br />GO</pre><p>After running it 5 times, on average it took about 0.2 seconds to bring the VLFTest2 database online. That’s quite a huge difference: 23 seconds vs. 0.2 seconds, more than a hundred times slower.</p><p>I also noticed a big difference in the time it took to restore the <strong>VLFTest</strong> database (with 10,002 VLFs) compared to the <strong>VLFTest2</strong> database (with 16 VLFs). It took about 1 minute and 35 seconds to restore the <strong>VLFTest</strong> database, but only about 6 seconds to restore <strong>VLFTest2</strong>.</p><p>In addition to the above, having an excessive number of VLFs in your database can also have an impact on database transactions. A while ago, Linchi Shea wrote a blog post regarding this. It can be found <a href="http://www2.sqlblog.com/blogs/linchi_shea/archive/2009/02/09/performance-impact-a-large-number-of-virtual-log-files-part-i.aspx" target="_blank">here</a>.</p><h3>Take Away</h3><p>Having a high count of VLFs can hurt your database. To avoid it, we need to carefully plan our database size, especially, in this case, the transaction log size. We also need to make sure that the transaction log file growth increment is not too small. This ensures that we don’t end up with many auto-growths on the transaction log file, each of which adds to the VLF count.</p>
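<p>To put the take away into practice, here is a minimal sketch (again assuming SQL Server 2016 SP2 or later for sys.dm_db_log_info) that lists the VLF count for every online database on an instance, highest first, so transaction logs that grew in small increments can be spotted before they become a problem:</p><pre class="brush: sql;">-- One row per online database, highest VLF counts first<br />SELECT d.name AS database_name,<br /> COUNT(*) AS vlf_count<br />FROM sys.databases AS d<br /> CROSS APPLY sys.dm_db_log_info(d.database_id) AS li<br />WHERE d.state = 0 -- online databases only<br />GROUP BY d.name<br />ORDER BY vlf_count DESC;</pre>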
<h3>Further Reading</h3><p>You might also want to read Kimberly Tripp’s blog post regarding VLFs. It can be found <a href="http://www.sqlskills.com/blogs/kimberly/transaction-log-vlfs-too-many-or-too-few/" target="_blank">here</a>.</p> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com3tag:blogger.com,1999:blog-8792628637495446545.post-67578246620764241552014-02-06T09:12:00.000-05:002014-02-06T09:12:17.216-05:00Utilizing smtp4dev with SQL Server Database Mail for Testing<p>SQL Server has database mail, a solution that allows us to send e-mail from our database server. It allows us to send e-mail in either HTML or text format to users. It’s great because we can use it to send query results to users (or ourselves). We can also configure SQL Server Agent to use database mail to send alerts to operators.</p> <p>Database mail utilizes an SMTP server to send the e-mails. In some cases, we might not have a readily available SMTP server in our environment (like in my home lab environment), or we might be blocked from using the SMTP server from our development environment for security reasons. In those situations, I’ve been using <a href="http://smtp4dev.codeplex.com/" target="_blank">smtp4dev</a>. It is a neat little utility that allows you to have a dummy SMTP server on your local computer. It is lightweight, and no installation is needed. When we run the utility, it sits in the system tray and listens on port 25 (the default SMTP port). It receives the e-mails that get sent its way, without delivering them to their final destination (the e-mail recipients). It is pretty handy when we need to run some tests.</p> <p>Setting up database mail to work with smtp4dev is pretty simple.</p> <ul> <li>Download smtp4dev from <a href="http://smtp4dev.codeplex.com">http://smtp4dev.codeplex.com</a>. </li> <li>Extract the file and save it on your system (assuming that it is your development system and it has the SQL Server instance). </li> <li>Run smtp4dev. Note: if you have Windows Firewall turned on, you might get a “Windows Security Alert” asking if you want to allow smtp4dev to communicate with the private network, the public network or both. Since I normally run smtp4dev on my development system, which also has the SQL Server instance on which I want to turn database mail on, I would just hit the cancel button. </li> <li>If database mail in SQL Server has not been enabled, we can enable it by using the following SQL script: </li> </ul> <pre class="brush: sql;">exec sp_configure 'show advanced', 1;<br />go<br />reconfigure;<br />go<br />exec sp_configure 'Database Mail XPs', 1;<br />go<br />reconfigure;<br />go</pre><ul> <li>We then need to set up the database mail profile and account, and then associate the profile with the account in SQL Server. To do that, we can use the following script (based on the “Database Mail Simple Configuration Template”): </li></ul><pre class="brush: sql;">DECLARE @profile_name sysname,<br /> @account_name sysname,<br /> @SMTP_servername sysname,<br /> @email_address NVARCHAR(128),<br /> @display_name NVARCHAR(128),<br /> @error_display NVARCHAR(500);<br /><br />-- Profile name. Replace with the name for your profile<br /> SET @profile_name = 'SMTP4Dev';<br /><br />-- Account information. Replace with the information for your account.<br /><br /> SET @account_name = 'SMTP4Dev Account';<br /> SET @SMTP_servername = 'localhost'; --- Since the smtp4dev is on the local system.
<br /> SET @email_address = 'john.doe@email.com';<br /> SET @display_name = 'John Doe';<br /><br />-- Verify the specified account and profile do not already exist.<br />IF EXISTS (SELECT * FROM msdb.dbo.sysmail_profile WHERE name = @profile_name)<br />BEGIN<br /> SET @error_display = 'The specified Database Mail profile (' + @profile_name + ') already exists.';<br /> RAISERROR(@error_display, 16, 1);<br /> GOTO done;<br />END;<br /><br />IF EXISTS (SELECT * FROM msdb.dbo.sysmail_account WHERE name = @account_name )<br />BEGIN<br /> SET @error_display = 'The specified Database Mail account (' + @account_name + ') already exists.';<br /> RAISERROR(@error_display, 16, 1) ;<br /> GOTO done;<br />END;<br /><br />-- Start a transaction before adding the account and the profile<br />BEGIN TRANSACTION ;<br /><br />DECLARE @rv INT;<br /><br />-- Add the account<br />EXECUTE @rv=msdb.dbo.sysmail_add_account_sp<br /> @account_name = @account_name,<br /> @email_address = @email_address,<br /> @display_name = @display_name,<br /> @mailserver_name = @SMTP_servername;<br /><br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to create the specified Database Mail account (' + @account_name + ').';<br /> RAISERROR(@error_display, 16, 1) ;<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END<br /><br />-- Add the profile<br />EXECUTE @rv=msdb.dbo.sysmail_add_profile_sp<br /> @profile_name = @profile_name ;<br /><br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to create the specified Database Mail profile (' + @profile_name + ').';<br /> RAISERROR(@error_display, 16, 1);<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END;<br /><br />-- Associate the account with the profile.<br />EXECUTE @rv=msdb.dbo.sysmail_add_profileaccount_sp<br /> @profile_name = @profile_name,<br /> @account_name = @account_name,<br /> @sequence_number = 1 ;<br /><br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to associate the specified profile with the specified account (' + @account_name + ').';<br /> RAISERROR(@error_display, 16, 1) ;<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END;<br /><br />COMMIT TRANSACTION;<br /><br />done:<br /><br />GO<br /></pre><ul><li>Now the moment of truth: we can test whether it works by trying to send an e-mail using sp_send_dbmail. </li></ul><pre class="brush: sql;">exec msdb..sp_send_dbmail @profile_name = 'SMTP4Dev',<br /> @recipients = 'jane.doe@email.com',<br /> @subject = 'Test E-mail',<br /> @body = 'This is a test email.';</pre>
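<p>One thing to keep in mind is that sp_send_dbmail only queues the message; database mail delivers it asynchronously. If nothing seems to arrive, msdb keeps delivery records that make troubleshooting easier. A minimal sketch to check whether the test message was handed off to smtp4dev and whether any errors were logged:</p><pre class="brush: sql;">USE [msdb];<br />GO<br /><br />-- sent_status should change from 'unsent' to 'sent' once the message is handed to the SMTP server<br />SELECT mailitem_id, recipients, subject, sent_status, sent_date<br />FROM dbo.sysmail_allitems;<br /><br />-- Any errors raised by the Database Mail external program<br />SELECT log_date, event_type, description<br />FROM dbo.sysmail_event_log<br />ORDER BY log_date DESC;</pre>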
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjjoVLJ0uOlOmWXSPBPjv11AfrA-2R7HUTirYxxS3r2B4m9OmK7lZKpEm4IDNQfKAVdllx3c_O3SXqwu94M8Vjv50z6QLRlHO8vQ3qdvwDyitfQxibK6G1NHvZ9ciTG89gKcih6oZ3Y2-If/s1600-h/image6.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgiT7rGiatrEvhBYKhUA_rFE2hnbyzaDX9F4oj6_zyOg8HR-6EZy-o7IqDIElEtJ-CFSguZbEb9bEUOR3xNqK4h6_quuv52rqH5yyn-I10e1RkJcqhsEhSk46sWT-FBlOYvYgipXdjxi5ZI/?imgmax=800" width="534" height="390" /></a></p><p>To view the e-mail, you can click on the “View” or “Inspect” button.</p><p>Some caveats that I’ve noticed:</p><ul><li>I have Microsoft Office 2013 installed on my system, which includes Outlook 2013. For some reason, when I click the “View” button, I would get an error message “The attempted operation failed. An object could not be found.” and it just would not open the e-mail (if I try to open it using Outlook 2013). It is ok if I use Windows Live Mail 2012. I would normally use the “Inspect” button instead. </li><li>If you exit out smtp4dev, the e-mails that you have on the Messages tab will get deleted. </li></ul><h3>Cleaning Up</h3><p>If you want to remove the database mail from your SQL Server, first you want to remove the profile association with account, the profile and the account. You can do that by using the following script:</p><pre class="brush: sql;">DECLARE @profile_name sysname,<br /> @account_name sysname,<br /> @error_display nvarchar(500);<br /><br />-- Profile name. Replace with the name for your profile<br /> SET @profile_name = 'SMTP4Dev';<br /><br />-- Account information. 
Replace with the information for your account.<br /> SET @account_name = 'SMTP4Dev Account';<br /><br />IF NOT EXISTS (SELECT * FROM msdb.dbo.sysmail_profile WHERE name = @profile_name)<br />BEGIN<br /> SET @error_display = 'The specified Database Mail profile (' + @profile_name + ') does not exist.';<br /> RAISERROR(@error_display, 16, 1);<br /> GOTO done;<br />END<br /><br />IF NOT EXISTS (SELECT * FROM msdb.dbo.sysmail_account WHERE name = @account_name)<br />BEGIN<br /> SET @error_display = 'The specified Database Mail account (' + @account_name + ') does not exist.';<br /> RAISERROR(@error_display, 16, 1);<br /> GOTO done;<br />END<br /><br />-- Start a transaction before deleting the profile account, profile and account.<br />BEGIN TRANSACTION ;<br /><br />DECLARE @rv INT;<br /><br />EXEC @rv=msdb.dbo.sysmail_delete_profileaccount_sp <br /> @profile_name = @profile_name, <br /> @account_name = @account_name;<br /><br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to delete the profile with the specified account (' + @account_name + ').';<br /> RAISERROR(@error_display, 16, 1) ;<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END;<br /><br />EXEC @rv=msdb.dbo.sysmail_delete_profile_sp<br /> @profile_name = @profile_name;<br /><br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to delete the database mail profile (' + @profile_name + ').';<br /> RAISERROR(@error_display, 16, 1) ;<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END;<br /><br />EXEC @rv=msdb.dbo.sysmail_delete_account_sp<br /> @account_name = @account_name;<br />IF @rv<>0<br />BEGIN<br /> SET @error_display = 'Failed to delete the database mail account (' + @account_name + ').';<br /> RAISERROR(@error_display, 16, 1) ;<br /> ROLLBACK TRANSACTION;<br /> GOTO done;<br />END;<br /><br />COMMIT TRANSACTION;<br /><br />done:<br /><br />GO</pre><p>After that, you can disable database mail by using the following script:</p><pre class="brush: sql;">exec sp_configure 'Database Mail XPs', 0;<br />go<br />reconfigure;<br />go<br />exec sp_configure 'show advanced', 0<br />go<br />reconfigure;<br />go</pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com2tag:blogger.com,1999:blog-8792628637495446545.post-44491012367648167872014-01-27T22:40:00.001-05:002014-01-27T22:49:23.953-05:00Unique Filtered Index<p>Consider a scenario where you need to create a table to store contact information, and in one of the columns we need to store the passport number for each contact, if one is available. Assuming that a passport number is unique to each contact, but not every contact has a passport, how can we ensure the uniqueness of the passport numbers entered while allowing null values for those contacts who don't have one?</p><p>A unique constraint/index would come to mind. However, the problem with using a unique constraint/index in this scenario is that while it would ensure that the data entered is unique, a column with a unique constraint/index can only allow one null value.</p>
<p>A quick demo of this using a unique index (the result should be similar if we use a unique constraint as well):</p><pre class="brush: sql;">IF (OBJECT_ID('dbo.ContactInfo') IS NOT NULL)<br />BEGIN<br /> DROP TABLE dbo.ContactInfo<br />END<br /><br />CREATE TABLE dbo.ContactInfo (<br /> ContactID INT PRIMARY KEY IDENTITY (1,1),<br /> ContactName VARCHAR(100) NOT NULL,<br /> PassportNumber VARCHAR(25) NULL);<br /><br />CREATE UNIQUE INDEX UI_PassportNumber<br />ON dbo.ContactInfo(PassportNumber)<br /></pre><p>Now let's test to make sure that we can't enter a duplicate passport number:</p><pre class="brush: sql;">INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('John Doe', 'B1234567890');<br />INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Jane Doe', 'B1234567890');<br /></pre><p>We would get the following messages:</p><blockquote><p>(1 row(s) affected)<br />Msg 2601, Level 14, State 1, Line 2<br />Cannot insert duplicate key row in object 'dbo.ContactInfo' with unique index 'UI_PassportNumber'. The duplicate key value is (B1234567890).<br />The statement has been terminated.</p></blockquote><p>This is to be expected: it inserts the information for John Doe just fine, but then fails to insert the information for Jane Doe, since Jane Doe’s passport number is the same as John Doe’s.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhs3Ss8xc60gxlgwjhyphenhyphenBRFJY0kH9Z4YWXNVmX6fve8oA7TYWX0hzPWqomx9dwoUkbsM7D820SaNhrWMyEwEBJR3UWGZSHatiXpKgZg1vlkDuOVwtOugdf2xFRm27yYt7arE1u1d8UcvibqC/s1600-h/image%25255B7%25255D.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiB00ADlcAuDY3c2tp5_jRQrDM0EeotjI5prlha4GTUAg2mJqbBbdFqC-jPAAhotDV5Q6gukY7yQeCHIkLBGT4tcRakzEowFhg8ak_tNmo2NV-DbaSSUqVsz7ebtRPuUULpT6RWa9A5nJjY/?imgmax=800" width="409" height="162" /></a></p><p>Now let’s test to make sure that we can enter multiple contacts with a blank passport number.</p><pre class="brush: sql;">INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Mary Doe', NULL);<br />INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Jack Doe', NULL);<br /></pre><p>We would get the following messages:</p><blockquote><p>(1 row(s) affected)<br />Msg 2601, Level 14, State 1, Line 2<br />Cannot insert duplicate key row in object 'dbo.ContactInfo' with unique index 'UI_PassportNumber'. The duplicate key value is (<NULL>).<br />The statement has been terminated.</p></blockquote><p>It inserts the contact info for Mary Doe just fine, but it fails when trying to insert the contact info for Jack Doe.
This is because the unique index can allow only one null value.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9aqBcMuaHQKxJv2ZZjz0SCFwatjz9mSxCkLjN5f4qccXndWAUXLGoE7fHIYBo3IEcz8X6m-wEmcFFNhKjlQwekWNyHgcYDJBEogGORmAXkk0AJyxmGYQAbuDiRvMkUKF_cDG67hGAIv6t/s1600-h/image%25255B11%25255D.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyuidR9Obpb3RKdefDBzKyP3a1mujo6Xb6Tg_vVqM3ypKJKUt-cXTueFIisx_FMWNf1RjfvawRm0KKwSZINlQmYnRzF1P73bEzPk6BFi9Y10YZ0ywIOzp-UDdfgBqIgW4vLzIIrA10q2rd/?imgmax=800" width="393" height="174" /></a></p><h3>Possible Solution</h3><p>So how can we ensure that our passport column accepts null values while ensuring uniqueness for the non-null values? With SQL Server 2008 and above, one possibility is to use a unique filtered index. A quick demo for this one:</p><pre class="brush: sql;">IF (OBJECT_ID('dbo.ContactInfo') IS NOT NULL)<br />BEGIN<br /> DROP TABLE dbo.ContactInfo<br />END<br /><br />CREATE TABLE dbo.ContactInfo (<br /> ContactID INT PRIMARY KEY IDENTITY (1,1),<br /> ContactName VARCHAR(100) NOT NULL,<br /> PassportNumber VARCHAR(25) NULL);<br /><br />CREATE UNIQUE INDEX UI_PassportNumber<br />ON dbo.ContactInfo(PassportNumber)<br />WHERE PassportNumber IS NOT NULL<br /></pre><p>Let’s see what happens if we try to enter contact info with a duplicate passport number:</p><pre class="brush: sql;">INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('John Doe', 'B1234567890');<br />INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Jane Doe', 'B1234567890');<br /></pre><p>We would get the following messages:</p><blockquote><p>(1 row(s) affected) <br />Msg 2601, Level 14, State 1, Line 2 <br />Cannot insert duplicate key row in object 'dbo.ContactInfo' with unique index 'UI_PassportNumber'. The duplicate key value is (B1234567890). <br />The statement has been terminated.</p></blockquote><p>That works as expected.
Only the record for John Doe got inserted.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEwHSHD6mMQ7AUFDo34GtnromEAMsV2jS718SJEfisRO70M4idnIop30ngm3wVnEX8cJWPOWsMupOx8jPMWGXiVHKNmCwHDDC_qmhIm9KPNkKvRQnm7kQYJztGg-ODtSmt_-AgPQYiAt2k/s1600-h/image%25255B15%25255D.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOJhinI7_H6FbxTW8cuJgVWSBN-rLC2b2TDH9YeAVOrcvLIbmLw6242vUUJjz9FjI2bcLrbZUgEctEwzfR9_7TeMrz6q31gZJFyuEm442av__3nnc87PEsdqGBbpe6G9jXLB3MnGA7sGPv/?imgmax=800" width="448" height="139" /></a></p><p>Now let’s see what happens if we try to enter multiple contacts with a blank passport number:</p><pre class="brush: sql;">INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Mary Doe', NULL);<br />INSERT INTO dbo.ContactInfo (ContactName, PassportNumber) VALUES ('Jack Doe', NULL);<br /></pre><p>We would get the following messages:</p><blockquote><p>(1 row(s) affected)</p><p>(1 row(s) affected)</p><br /></blockquote><p>Both records are inserted just fine, which is what we expect.</p><p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhd65311q4_i76TIghrLLFj5BjHotaF6EvAMsH5fUHPa7dG_7nsSaNaCIvP5WR-OqgmVf2hdQIWcHemDyHQVOlB3Xvy0w_BTrQD9D_elJNwkRJQYHxis8fyhoLtX9mf41u7yDtUGDNz-qM-/s1600-h/image%25255B19%25255D.png"><img title="image" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9VX83xkFqfkgsYdx99Q2Tlg4o7hRiDOZm5WpgzXiiVQtr5f_ap-zx76gKjN0fRyB8HmSMCOUeh1PZWOgZJwqc6Fj6YUwqKiC4oa3K2oyOQRhYFRAUhbFIBR5tL1Kq427Lni2XkBWvzpSk/?imgmax=800" width="436" height="186" /></a></p><h3>Conclusion</h3><p>We can use a unique filtered index as a way to enforce uniqueness on filtered data; in the example above, it enforces uniqueness on the non-null values in the passport number column. There is one caveat: unlike a unique constraint/index, which can be referenced by a foreign key constraint, a unique filtered index can’t be referenced by a foreign key constraint.</p><p>So if we continue on from the above examples and try to create a new table named PassportVisit with the following definition:</p><pre class="brush: sql;">IF (OBJECT_ID('dbo.PassportVisit') IS NOT NULL)<br />BEGIN<br /> DROP TABLE dbo.PassportVisit<br />END<br /><br />CREATE TABLE dbo.PassportVisit (<br /> PassportNumber VARCHAR(25) FOREIGN KEY REFERENCES dbo.ContactInfo(PassportNumber),<br /> DateofVisit DateTime,<br /> CountryofVisit VARCHAR(50))<br /></pre><p>We are trying to reference the passport number column in the PassportVisit table to the passport number column in the ContactInfo table. This would succeed if we had a unique constraint/index, but it fails with a unique filtered index, with the following messages:</p><blockquote><p>Msg 1776, Level 16, State 0, Line 6<br />There are no primary or candidate keys in the referenced table 'dbo.ContactInfo' that match the referencing column list in the foreign key 'FK__PassportV__Passp__73B0DA39'.<br />Msg 1750, Level 16, State 0, Line 6<br />Could not create constraint. See previous errors.</p></blockquote>
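<p>As a closing note, if you want to verify which indexes on a table are filtered (and what the filter predicate is), sys.indexes exposes that metadata on SQL Server 2008 and later. A minimal sketch:</p><pre class="brush: sql;">-- has_filter and filter_definition are populated for filtered indexes<br />SELECT name, is_unique, has_filter, filter_definition<br />FROM sys.indexes<br />WHERE object_id = OBJECT_ID('dbo.ContactInfo');</pre>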
 Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0tag:blogger.com,1999:blog-8792628637495446545.post-64754799847303552682013-06-09T22:33:00.001-04:002014-02-21T13:40:44.962-05:00List SSAS User Roles Using PowerShell (Part 2)<p>A while ago, I wrote a blog post regarding using PowerShell to list user roles and their members. The blog post can be found <a href="http://www.lucasnotes.com/2012/09/list-ssas-user-roles-using-powershell.html">here</a>. I’ve gotten a few questions about expanding the PowerShell script to:</p><ol><li>List user roles and their members for all databases within a particular SQL Server Analysis Services (SSAS) server.</li><li>Output the results into one table.</li> </ol><p>Here’s a PowerShell script that can be used to accomplish those goals:</p><pre class="brush: ps;">[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")<br /><br /># SSAS server name variable<br />$SSASServerName = "ServerName"<br /><br /># Try to connect to the SSAS server<br />$SSASServer = New-Object Microsoft.AnalysisServices.Server<br />$SSASServer.Connect($SSASServerName)<br /><br /># Object to store the result<br />$Result = @()<br /><br /># Get the SSAS databases and loop thru each of them<br />foreach ($DB in $SSASServer.Databases)<br />{<br /> # Get the SSAS database<br /> $SSASDatabase = $SSASServer.Databases.Item($DB.name)<br /><br /> # Get the roles available within the SSAS database and loop thru each of them<br /> foreach ($Role in $SSASDatabase.Roles)<br /> {<br /> # Get the members within the role and loop thru each one of them<br /> foreach ($UserName in $Role.Members)<br /> {<br /> # Create a new object that would store the database name, role name and member user name<br /> $ItemResult = New-Object System.Object<br /> $ItemResult | Add-Member -type NoteProperty -name DatabaseName -value $DB.Name<br /> $ItemResult | Add-Member -type NoteProperty -name RoleName -value $Role.Name<br /> $ItemResult | Add-Member -type NoteProperty -name UserName -value $UserName.Name<br /><br /> # Put the item result and append it to the result object<br /> $Result +=$ItemResult<br /> }<br /> }<br />}<br /><br />$Result | Select DatabaseName, RoleName, UserName | format-table -auto -wrap | Out-String</pre> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com3tag:blogger.com,1999:blog-8792628637495446545.post-5969533959920408192013-06-06T22:45:00.001-04:002013-06-06T22:45:44.333-04:00List SQL Server Analysis Services Database Properties Using PowerShell<p>If you have SQL Server Analysis Services (SSAS) and want to list the SSAS databases using PowerShell, here’s a PowerShell script that can do that:</p><pre class="brush: sql">[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")<br /><br /># Variable for your SSAS server name<br />$SSASServerName = "ServerName"<br /><br /># Connect to the SSAS server<br />$SSASServer = New-Object Microsoft.AnalysisServices.Server<br />$SSASServer.Connect($SSASServerName)<br /><br /># Try to list the SSAS database Name<br />$SSASServer.Databases | Select Name</pre><p>It turns out that you can do much more than list your SSAS database names with this. You can get the SSAS database collation, compatibility level, created date, last processed date, etc. 
To get the full list of properties that are available to you, try running the following PowerShell script:</p><pre class="brush: sql">[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")<br /><br /># Variable for your SSAS server name<br />$SSASServerName = "ServerName"<br /><br /># Connect to the SSAS server<br />$SSASServer = New-Object Microsoft.AnalysisServices.Server<br />$SSASServer.Connect($SSASServerName)<br /><br /># Try to list the SSAS database properties that are available to you<br />$SSASServer.Databases | Get-Member -MemberType property | FT -wrap | Out-String</pre><p>So now, if I want to get a list of each SSAS database’s name, collation, compatibility level, created date, estimated size and last processed date, I can run the following PowerShell script:</p><pre class="brush: sql">[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")<br /><br /># Variable for your SSAS server name<br />$SSASServerName = "ServerName"<br /><br /># Connect to the SSAS server<br />$SSASServer = New-Object Microsoft.AnalysisServices.Server<br />$SSASServer.Connect($SSASServerName)<br /><br /># Try to list the SSAS Database Name, Collation, Compatibility Level, Created Time, Estimated Size, Last Processed Date <br />$SSASServer.Databases | Select Name, Collation, CompatibilityLevel, CreatedTimeStamp, EstimatedSize, LastProcessed | FT -wrap | Out-String</pre><h3>Related Blog Post:</h3><p><a href="http://www.lucasnotes.com/2012/09/list-ssas-user-roles-using-powershell.html">List SSAS User Roles Using PowerShell</a></p> Lucashttp://www.blogger.com/profile/14231394530002840714noreply@blogger.com0