Join Me for PASS Data Expert Series!

I’m very honored to have one of my PASS Summit sessions chosen for the PASS Data Expert Series, taking place this Thursday, February 7th!

This is an awesome opportunity to see some of the top-rated sessions from this year’s PASS Summit. Included is my session on SQL Server Management Studio, which ended up being the second-best attended session of the entire conference. I will be available to answer any questions that come up during the presentation.

Sound awesome? Register today! I hope to see you there!


Speaking at SQL Saturday Oslo 2018

I always feel honored when chosen to present at a SQL Saturday event. Being selected is never a guarantee, especially these days when organizers seemingly have more speakers and abstracts to pick from than ever before. But I am just over-the-moon happy to share that I was picked to speak at SQL Saturday in Oslo, Norway coming up on September 1. Norway has been on my list of places to visit for years, and I really can’t wait. Thank you so much to the SQL Saturday Oslo organizing team for putting together an amazing schedule and for allowing me to be a part of it – this is going to be a fantastic event!

My presentation at this event is entitled “Select Stars: A SQL DBA’s Introduction to Azure Cosmos DB”. I’ve been working with Azure Cosmos DB for a while now, and it’s really an incredible product. It’s been generating lots of buzz as of late, but there are still plenty of DBAs out there who have yet to use it. This is an introductory-level session that focuses on explaining what Azure Cosmos DB is, how it works, what its strengths are, and how they can be leveraged. To anyone curious about Azure Cosmos DB, this is the session for you!

Registration for SQL Saturday Oslo is now open – register today before it fills up! If you would like to extend your SQL Saturday experience even further, they are also offering four pre-conference sessions on Friday August 31.

I am incredibly stoked to visit Norway and I hope I’ll see you there in just three short months!

(Syndicated from Bob’s home blog site at bobpusateri.com)


Importing Data With PowerShell and dbatools

I like to use public datasets for experimentation and presentation demos, especially data that people can easily understand and relate to. For some of these datasets, keeping them up to date was a manual process of downloading files, loading tables, and merging. There are of course many better ways to do this, some more automated than others. I could have simply used PowerShell to call bcp, or even just implemented an insert statement and some loops. Then I found dbatools, which has commands that let me do an even better job with far less work – just the way I like it! Here’s how I now keep my datasets current:

Getting The Data

I’ll be using data from the City of Chicago’s Data Portal. They have a tremendous online resource with lots of public datasets available. One that I really like is their listing of towed vehicles. Any time the city tows or impounds a vehicle, a record gets added here and remains for 90 days. It’s very manageable, with only 10 columns and a few thousand rows. (As an added bonus, you can search for license plates you know and then ask your friends about their experience at the impound lot!)

Chicago’s data portal uses Socrata, which is a very well-documented and easy-to-use tool for exposing data. It has a wonderful API for querying and accessing data, but to keep things simple for this post we’re just going to download a CSV file.

If you’re on the page for a dataset, you can download it by clicking on “Export” on the top right and then selecting “CSV”. To avoid all that, the direct link to download a CSV of this dataset is here. Download it and take a look at what we’ve got using your spreadsheet or text editor of choice (mine is Notepad++).
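
If you’d rather pull the file down from a script instead of the browser, a minimal PowerShell sketch could look like the following. The URL and local path are placeholders (grab the real link from the Export menu), and $downloadFile matches the variable used in the import command later on.

# Placeholder values - substitute the dataset's real CSV export link and a local path
$csvUrl = 'https://data.cityofchicago.org/api/views/<dataset-id>/rows.csv?accessType=DOWNLOAD'
$downloadFile = 'C:\Temp\TowedVehicles.csv'

# Download the CSV export of the towed vehicles dataset
Invoke-WebRequest -Uri $csvUrl -OutFile $downloadFile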

Loading The Data

We’ve got our data, now let’s load it. I like to load the entire downloaded dataset into a stage table, and then copy new rows I haven’t previously seen into my production table that I query from. Here’s the script to create these tables:

-- CREATE STAGE TABLE
CREATE TABLE [dbo].[TowedVehiclesSTG](
    [TowDate] [date] NOT NULL,
    [Make] [nchar](4) NULL,
    [Style] [nchar](2) NULL,
    [Model] [nchar](4) NULL,
    [Color] [nchar](3) NULL,
    [Plate] [nchar](8) NULL,
    [State] [nchar](2) NULL,
    [TowedToFacility] [nvarchar](75) NULL,
    [FacilityPhone] [nchar](14) NULL,
    [ID] [int] NOT NULL
);

-- CREATE FINAL TABLE
CREATE TABLE [dbo].[TowedVehicles](
    [ID] [int] NOT NULL,
    [TowDate] [date] NOT NULL,
    [Make] [nchar](4) NULL,
    [Style] [nchar](2) NULL,
    [Model] [nchar](4) NULL,
    [Color] [nchar](3) NULL,
    [Plate] [nchar](8) NULL,
    [State] [nchar](2) NULL,
    [TowedToFacility] [nvarchar](75) NULL,
    [FacilityPhone] [nchar](14) NULL,
    CONSTRAINT PK_TowedVehicles PRIMARY KEY CLUSTERED (ID)
);

Now for the magic – let’s load some data! The dbatools command that does all the heavy lifting here is called Import-DbaCsvToSql. It loads CSV files into a SQL Server table quickly and easily. As an added bonus, the entire import runs within a transaction, so if an error occurs everything gets rolled back. I like to specify my tables and datatypes ahead of time, but if you want to load into a table that doesn’t exist yet, the command will create the table and do its best to guess appropriate datatypes. To use it, simply point it at a CSV file and a SQL Server instance, database, and (optionally) a table. It will take care of the rest.

# Load from CSV into staging table
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance InstanceName -Database TowedVehicles -Table TowedVehiclesSTG `
    -Truncate -FirstRowColumns

The two parameters on the second line tell the command to truncate the table before loading, and that the first line of the CSV file contains column names.

Now the data has been staged, but since this dataset contains all cars towed over the past 90 days, chances are very good that I already have some of these tows in my production table from a previous download. A simple query to insert all rows from staging into production that aren’t already there will do the trick. This query is run using another dbatools command, Invoke-Sqlcmd2.

# Move new rows from staging into production table
Invoke-Sqlcmd2 -ServerInstance InstanceName -Database TowedVehicles `
    -Query "INSERT INTO [dbo].[TowedVehicles]
SELECT
    [ID],
    [TowDate],
    [Make],
    [Style],
    [Model],
    [Color],
    [Plate],
    [State],
    [TowedToFacility],
    [FacilityPhone]
FROM (
    SELECT
        s.*,
        ROW_NUMBER() OVER (PARTITION BY s.ID ORDER BY s.ID) AS n
    FROM [dbo].[TowedVehiclesSTG] s
    LEFT JOIN [dbo].[TowedVehicles] v ON s.ID = v.ID
    WHERE v.ID IS NULL
) a
WHERE a.n = 1"

The ID column uniquely identifies each tow event, and the production table uses it as a primary key. However, I have found that the dataset occasionally contains duplicate rows. The ROW_NUMBER() window function addresses this by ensuring only one row per ID is inserted.

Putting It All Together

I’ve shown you how simple dbatools makes it to load a CSV file into a table and then run a query to load from staging into production, but the beauty of PowerShell is that it’s easy to do far more than that. I actually scripted this entire process, including downloading the data! You can download the full PowerShell script, along with a T-SQL script for creating the tables, from my GitHub here.
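
As a rough illustration of how those pieces fit together, here is a minimal sketch of such a script. The URL, file path, and instance name are placeholders, not the contents of the actual script on GitHub.

# Minimal end-to-end sketch; URL, file path, and instance name are placeholders
$csvUrl       = 'https://data.cityofchicago.org/api/views/<dataset-id>/rows.csv?accessType=DOWNLOAD'
$downloadFile = 'C:\Temp\TowedVehicles.csv'
$instance     = 'InstanceName'
$database     = 'TowedVehicles'

# 1. Download the latest CSV export
Invoke-WebRequest -Uri $csvUrl -OutFile $downloadFile

# 2. Truncate and reload the staging table from the CSV
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance $instance -Database $database `
    -Table TowedVehiclesSTG -Truncate -FirstRowColumns

# 3. Insert rows from staging that aren't already in production (same query as above)
Invoke-Sqlcmd2 -ServerInstance $instance -Database $database `
    -Query "INSERT INTO [dbo].[TowedVehicles]
            SELECT [ID], [TowDate], [Make], [Style], [Model], [Color], [Plate],
                   [State], [TowedToFacility], [FacilityPhone]
            FROM (
                SELECT s.*, ROW_NUMBER() OVER (PARTITION BY s.ID ORDER BY s.ID) AS n
                FROM [dbo].[TowedVehiclesSTG] s
                LEFT JOIN [dbo].[TowedVehicles] v ON s.ID = v.ID
                WHERE v.ID IS NULL
            ) a
            WHERE a.n = 1"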

Happy Data Loading!

This post was cross-posted from Bob’s personal technical blog at bobpusateri.com.


Bob Pusateri presenting at Chicago Suburban SQL Server Users Group

Bob Pusateri from our team is proud to be presenting a session called “Locks, Blocks, and Snapshots: Maximizing Database Concurrency” at the Suburban Chicago SQL Server Users Group on April 17th at 6pm.

Abstract: The ability for multiple processes to query and update a database concurrently has long been a hallmark of database technology, but this feature can be implemented in many ways. This session will explore the different isolation levels supported by SQL Server and Azure SQL Database, why they exist, how they work, how they differ, and how In-Memory OLTP fits in. Demonstrations will also show how different isolation levels can determine not only the performance of a query, but also the result set it returns. Additionally, attendees will learn how to choose the optimal isolation level for a given workload, and see how easy it can be to improve performance by adjusting isolation settings. An understanding of SQL Server’s isolation levels can help relieve bottlenecks that no amount of query tuning or indexing can address – attend this session and gain Senior DBA-level skills on how to maximize your database’s ability to process transactions concurrently.

RSVP for this event today!


Prevent Presentation Disasters by Pausing Windows Updates

It’s no secret that Windows 10 likes to forcibly apply updates, and occasionally these forced updates can occur at very inopportune times. I’ve heard tales of forced updates ruining demonstrations and presentations not only at SQL Saturdays, but also at this year’s PASS Summit.

What most people don’t seem to realize is that this type of disaster can be prevented rather easily. Here’s how to prevent surprise updates in Windows 10:

Have The Right Version

All the settings shown here first appeared in the Windows 10 Creators Update, version 1703. By this point you probably have it on your machine (in fact, as I write this, the Fall 2017 Creators Update, version 1709, is being pushed out to users). But if you’re unsure of your Windows version and want to check, right-click on the Start button and choose “System” to bring up information about your PC.


So long as you have version 1703 or later, you’re good!
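
If you prefer to check from a prompt instead of the Settings app, one quick option (a sketch that assumes PowerShell and the standard Windows 10 registry location) is to read the ReleaseId value:

# Returns the Windows 10 release number, e.g. 1703 or 1709
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').ReleaseId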

Advanced Windows Update Options

Navigate to this menu by clicking on the Start button and searching for “Advanced Windows Update Options”.

Once this window opens, you’ll see several options that can be configured:


Towards the bottom is the silver tuna: the “Pause Updates” switch. As the very helpful description states, pausing updates stops their installation for up to 35 days (unless you un-pause it sooner). Once those 35 days are up, you must bring your device up-to-date before you are allowed to pause again.

Moving up from the Pause Updates switch are two drop-downs that allow updates to be deferred. “Feature updates”, which are released semi-annually, can be delayed for up to a year. “Quality updates”, the cumulative monthly updates, can be deferred by up to a month.

Moving up again, you will find the Windows servicing channel. The channels available to you depend on which edition of Windows 10 you are running, and a matrix is available here. Adding to the confusion is the fact that Microsoft is changing the names of the channels in the Windows 10 Fall 2017 Creators Update (version 1709).

The Current Branch is considered appropriate for home users, and receives updates as soon as they are available. Beginning in version 1709 this will be renamed to “Semi-Annual Channel (Targeted)”.

The Current Branch for Business receives updates a few months after they are made available, and is directed towards business users. Beginning in version 1709 this will be known as the “Semi-Annual Channel”.

Thank you, Microsoft, for taking some pretty straight-forward names and changing them to be nearly identical and way more confusing.

My Recommendations

Here’s what I do: if I have a presentation or other important event coming up, I pause updates a week or two beforehand. This gives me adequate time to test everything in a stable environment as well as the peace of mind that unexpected updates won’t occur. I also allow my feature and quality updates to be deferred by up to 15 days. That way, if an update does pop up out of nowhere, I’m not forced to install it immediately.

This post is syndicated from Bob’s personal blog site at bobpusateri.com.


24 Hours of PASS: Data Security and Quality Wrapup

Thank you to all who attended my session on “Passive Security for Hostile Environments” back on the 3rd of this month. I consider it an honor to be part of such a wonderful lineup. I just received my evaluations and comments, and am very happy to report that the results were extremely positive. Thank you very much to the people who took the time to rate my presentation and offer feedback, which I will include at the end of this post.

I was also very surprised to hear that my session had 193 attendees, which puts it in the top five in terms of attendance – wow!

If you weren’t able to attend but would like to check it out, a recording is now available.

My slide deck is available for download on the 24 Hours of PASS site.

Demo scripts and other resources are available here.

Feedback

Sessions were evaluated based on four questions, and I received 78 total responses.

  1. How would you rate this session overall?
    Excellent: 60    Good: 18     Average: 0     Fair: 0     Poor: 0
  2. How would you rate the speakers’ presentation skills?
    Excellent: 64     Good: 13     Average: 0     Fair: 0     Poor: 0
  3. How would you rate the speakers’ knowledge of the subject?
    Excellent: 71      Good: 6      Average: 0     Fair: 0     Poor: 0
  4. Did you learn what you expected to learn from this session?
    Agree: 67     Somewhat Agree: 7     Neutral: 3     Somewhat Disagree: 0     Disagree: 0

I also received the following comments:

  • excellent demos and real deep dive into the details of each area he covered.
  • Your demo scenarios were very effective in showing the strengths and weaknesses of each option. Well done.
  • great use of demos!
  • Thank you!
  • Great presentation. Good demos – be great to get a copy of the scripts.
  • I had a lot of familiarity with DDL/DML triggers and Event Notifications. There were some new aspects you showed that I had not considered. Impersonation, for instance. Policy based management is something I haven’t used, but have read about. The session helped reinforce what I’ve learned in the past. Slide decks are great. But I prefer live demos and the code. You had a good, complementary mix of both.
  • Wow, this was incredibly good! So well organized. You covered a lot of territory.
  • Great overview of the different tech
  • Very clear explanations and demos, great pace for a webinar. Packed full of useful examples for real projects. Thank you!

This post is syndicated from BobPusateri.com.