May 18, 2018
 

If you’ve been a SQL Saturday organizer for a few years, you might remember that prior to March of 2014, PASS handled much of the finances. All money collected from sponsors and meal fees went into a trust account managed by PASS and earmarked for that specific event. Organizers then requested their funds from PASS.

I completely understand why PASS would want to get out of doing this. PASS handles a lot of things, and to be honest, playing the role of banker for events organized and managed by volunteers doesn’t need to be a part of that.

While I believe this was the right thing to do, it did create some additional headaches for organizers, particularly around taxes, depending on local laws. PASS is a not-for-profit organization and could collect this money without any tax liability. With PASS removed from the equation, event organizers now had to establish a PayPal account in their own name to collect funds for SQL Saturday, and they were personally liable for any taxes owed on those funds.

The user group leadership in Chicago didn’t really like that idea, so we decided to incorporate as a not-for-profit ourselves, creating the Chicago SQL Association. Over the years, I’ve been contacted by several other user groups asking how we did this. I’m happy to help, and thought I would share here as well.

Choose A Mission and Scope

Any good project needs a mission and a scope, and the same goes for an organization. We decided our scope would be to support SQL Saturday Chicago, as well as the Chicago SQL Server User Group and the Chicago Suburban SQL Server User Group. Our mission, simply put, is education. We offer free knowledge about the Microsoft data platform to the public through user group meetings and our SQL Saturday event.

Seek Professional Assistance

Creating a corporation isn’t all that difficult. I have no doubt that we could have fumbled our way through it and ended up with a not-for-profit corporation on our own. What I was less sure of was our ability to get it right the first time and avoid mistakes that might penalize us down the road. We looked into hiring an attorney to help us navigate the process, but it would have ended up costing thousands of dollars that the organization didn’t have.

One day it dawned on me to look into legal clinics at local law schools. These clinics, often free or available at little cost, offer law students the ability to gain practical experience while helping those who otherwise could not afford legal services. I reached out to the Bluhm Legal Clinic at Northwestern University Pritzker School of Law, and they got back to me very quickly. We were assigned three law students and a professor (also a lawyer) who helped us through the process from beginning to end.

After meeting with our legal team and discussing our goals, mission, and scope, they came to the conclusion that we should seek 501(c)(3) not-for-profit status. They drew up all the paperwork, and all we had to do was sign on the dotted lines and pay the required government filing fees. Within a matter of weeks we had a legitimate not-for-profit corporation, with zero sleep lost worrying that we made any mistakes.

After Incorporation

Now for the real work. As I said, creating a corporation is pretty easy. Maintaining it and keeping things legal is another matter entirely. Depending on your locale, you’ll probably have multiple required filings to keep your not-for-profit in compliance. Speaking for ourselves, each year we must file simplified taxes with the US Internal Revenue Service, register with the Illinois Secretary of State, and report our finances to the Illinois Attorney General.

We had done a pretty good job of bookkeeping before we incorporated, but now that we were official, it was required by law. We originally kept track of all our finances via shared spreadsheets. This worked decently well and cost us nothing, but come tax time, reporting was a nightmare. Of course, there was plenty of commercial software out there for managing business finances, but none of it was cheap. This is where finding organizations that help not-for-profits is invaluable.

I highly recommend applying to TechSoup. TechSoup offers a variety of discounted (and free) technology products to not-for-profits. One of the many things they offer is a significant discount on QuickBooks small business accounting software. It makes the financial part of running a business incredibly simple, and it honestly saved us about 20 hours of work when it came time to do taxes and financial reporting this year.

You’ll also want to register with PayPal as a not-for-profit to get a discounted rate. This means that more of the money collected from sponsorships and meal fees will go into your bank account. Every little bit helps!

A Web Presence

You may want to look into setting up a domain, website, and/or email for your organization. For us, this was probably the most difficult part of the process. I was unable to find any way of obtaining a discounted or free web domain, but that wasn’t a huge problem as domains are rather inexpensive. Hosting a website and email, however, is another story.

We applied to Microsoft’s nonprofit program, thinking we would easily meet their criteria: we offer free educational opportunities to the public, and those opportunities center on Microsoft products. Sounds like a shoo-in, right? Yeah, not so much. Microsoft did not agree, and we were rejected.

We then tried Google’s nonprofit program, which accepted us. We’re grateful they did, as we can now host association-related email at our own domain, and are working on a website. The irony is not lost on me that Google is helping us educate people about the Microsoft data platform while Microsoft had no interest in doing so.

Other Tips

Your organization will need some sort of bank account. Be sure to shop around at multiple banks to see what they offer not-for-profits, and what their fees are. If you have a local credit union, don’t forget to check them out as well. We ended up going with a credit union and pay no fees at all for our banking.

Get liability insurance. If your organization is running a SQL Saturday, chances are your venue already requires insurance, but even if it doesn’t, a policy is never a bad idea. We found a company that specializes in insuring activities like user groups, and it costs much less than we thought it would.

Looking Back

I’m very grateful we were able to incorporate the Chicago SQL Association as a not-for-profit. Not only have we seen financial benefits, but we have also become more organized as a result of having a corporation to maintain. PASS no longer handling the financial end of SQL Saturday events may have been a short-term annoyance, but it was just the push we needed to get this ball rolling and I’m very glad we did!

Apr 23, 2018
 

I’ve found myself working with PowerShell more frequently as of late, in no small part due to the amazing dbatools module. This has led to me trying to figure out how else I can utilize it for some of my personal internal processes.

I like to use public datasets for experimentation and presentation demos, especially data that people can easily understand and relate to. For some of these datasets, keeping them up-to-date was a manual process of downloading files, loading tables, and merging. There are of course many better ways to do this, some more automated than others. I could have simply used PowerShell to call bcp, or even just implemented an insert statement and some loops. Then I found dbatools, which has commands that let me do an even better job with far less work – just the way I like it! Here’s how I now keep my datasets current:

Getting The Data

I’ll be using data from the City of Chicago’s Data Portal. They have a tremendous online resource with lots of public datasets available. One that I really like is their listing of towed vehicles. Any time the city tows or impounds a vehicle, a record gets added here and remains for 90 days. It’s very manageable, with only 10 columns and a few thousand rows. (As an added bonus, you can search for license plates you know and then ask your friends about their experience at the impound lot!)

Chicago’s data portal uses Socrata, which is a very well-documented and easy-to-use tool for exposing data. It has a wonderful API for querying and accessing data, but to keep things simple for this post we’re just going to download a CSV file.

If you’re on the page for a dataset, you can download it by clicking on “Export” on the top right and then selecting “CSV”. To avoid all that, the direct link to download a CSV of this dataset is here. Download it and take a look at what we’ve got using your spreadsheet or text editor of choice (mine is Notepad++).
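If you’d rather not download the file by hand, the same step can be scripted. Here’s a minimal sketch using Invoke-WebRequest; the URL below is a placeholder for the dataset’s CSV export link, and the output path is just an example.

# Download the towed vehicles CSV to a local file
# NOTE: $csvUrl is a placeholder -- substitute the dataset's actual CSV export link
$csvUrl       = 'https://data.cityofchicago.org/path/to/towed-vehicles.csv'
$downloadFile = Join-Path $env:TEMP 'TowedVehicles.csv'

Invoke-WebRequest -Uri $csvUrl -OutFile $downloadFile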

Loading The Data

We’ve got our data, now let’s load it. I like to load the entire downloaded dataset into a stage table, and then copy new rows I haven’t previously seen into my production table that I query from. Here’s the script to create these tables:

-- CREATE STAGE TABLE
CREATE TABLE [dbo].[TowedVehiclesSTG](
	[TowDate] [date] NOT NULL,
	[Make] [nchar](4) NULL,
	[Style] [nchar](2) NULL,
	[Model] [nchar](4) NULL,
	[Color] [nchar](3) NULL,
	[Plate] [nchar](8) NULL,
	[State] [nchar](2) NULL,
	[TowedToFacility] [nvarchar](75) NULL,
	[FacilityPhone] [nchar](14) NULL,
	[ID] [int] NOT NULL
);


-- CREATE FINAL TABLE
CREATE TABLE [dbo].[TowedVehicles](
	[ID] [int] NOT NULL,
	[TowDate] [date] NOT NULL,
	[Make] [nchar](4) NULL,
	[Style] [nchar](2) NULL,
	[Model] [nchar](4) NULL,
	[Color] [nchar](3) NULL,
	[Plate] [nchar](8) NULL,
	[State] [nchar](2) NULL,
	[TowedToFacility] [nvarchar](75) NULL,
	[FacilityPhone] [nchar](14) NULL,
	CONSTRAINT PK_TowedVehicles PRIMARY KEY CLUSTERED (ID)
); 

Now for the magic – let’s load some data! The dbatools command that does all the heavy lifting here is called Import-DbaCsvToSql. It loads CSV files into a SQL Server table quickly and easily. As an added bonus, the entire import runs within a transaction, so if an error occurs everything gets rolled back. I like to specify my tables and datatypes ahead of time, but if you want to load into a table that doesn’t exist yet, the command will create one and do its best to guess the appropriate datatypes. To use it, simply point it at a CSV file and a SQL Server instance, database, and (optionally) a table. It will take care of the rest.

# Load from CSV into staging table
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance InstanceName -Database TowedVehicles -Table TowedVehiclesSTG `
-Truncate -FirstRowColumns

The two parameters on the second line tell the command to truncate the table before loading, and that the first line of the CSV file contains column names.

Now the data has been staged, but since this dataset contains all cars towed over the past 90 days, chances are very good that I already have some of these tows in my production table from a previous download. A simple query to insert all rows from staging into production that aren’t already there will do the trick. This query is run using another dbatools command, Invoke-Sqlcmd2.

# Move new rows from staging into production table
Invoke-Sqlcmd2 -ServerInstance InstanceName -Database TowedVehicles `
-Query "INSERT INTO [dbo].[TowedVehicles]
SELECT
  [ID],
  [TowDate],
  [Make],
  [Style],
  [Model],
  [Color],
  [Plate],
  [State],
  [TowedToFacility],
  [FacilityPhone]
FROM (
  SELECT
    s.*,
    ROW_NUMBER() OVER (PARTITION BY s.ID ORDER BY s.ID) AS n
  FROM [dbo].[TowedVehiclesSTG] s
  LEFT JOIN [dbo].[TowedVehicles] v ON s.ID = v.ID
  WHERE v.ID IS NULL
) a
WHERE a.n = 1"

The ID column uniquely identifies each tow event, and the production table uses it as a primary key. However, I have found that the dataset occasionally contains duplicate rows. The ROW_NUMBER() window function addresses this and ensures each ID is inserted only once.

Putting It All Together

I’ve shown you how simple dbatools makes it to load a CSV file into a table and then run a query to move data from staging into production, but the beauty of PowerShell is that it’s easy to do far more than that. I actually scripted this entire process, including downloading the data! You can download the full PowerShell script, along with a T-SQL script for creating the tables, from my GitHub here.
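For reference, here’s a condensed sketch of how the pieces could fit together in a single script. The URL, file path, instance, and database values are placeholders, and the full version on GitHub is the real thing; this is just to show the overall flow.

# Sketch: download the CSV, stage it, then merge new rows into production
# All names below (URL, path, instance, database) are placeholders
Import-Module dbatools

$csvUrl       = 'https://data.cityofchicago.org/path/to/towed-vehicles.csv'
$downloadFile = Join-Path $env:TEMP 'TowedVehicles.csv'
$instance     = 'InstanceName'
$database     = 'TowedVehicles'

# 1. Download the current 90-day extract
Invoke-WebRequest -Uri $csvUrl -OutFile $downloadFile

# 2. Load it into the staging table, truncating first
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance $instance -Database $database `
-Table TowedVehiclesSTG -Truncate -FirstRowColumns

# 3. Insert only the rows we haven't seen before
$mergeQuery = @"
INSERT INTO dbo.TowedVehicles
SELECT ID, TowDate, Make, Style, Model, Color, Plate, [State], TowedToFacility, FacilityPhone
FROM (
  SELECT s.*, ROW_NUMBER() OVER (PARTITION BY s.ID ORDER BY s.ID) AS n
  FROM dbo.TowedVehiclesSTG s
  LEFT JOIN dbo.TowedVehicles v ON s.ID = v.ID
  WHERE v.ID IS NULL
) a
WHERE a.n = 1;
"@
Invoke-Sqlcmd2 -ServerInstance $instance -Database $database -Query $mergeQuery

# 4. Clean up the downloaded file
Remove-Item $downloadFile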

Happy Data Loading!

Apr 12, 2018
 

Next week I am happy to be presenting for the Chicago Suburban SQL Server User Group, which meets in Downers Grove, IL.

I will be delivering my session Locks, Blocks, and Snapshots: Maximizing Database Concurrency. This takes a look at SQL Server’s available isolation levels, what they are, how they differ, and why the default setting of “read committed” may not be appropriate for your workload. It should not only be helpful, but lots of fun as well.

I hope you can join us! The meeting is on Tuesday, April 17, and you can register and find further details here!

Mar 26, 2018
 

I’m very pleased to be returning to Wisconsin in April to speak at SQL Saturday Madison 2018! This has always been one of my favorite events. Not only is it close to home, but it’s in a wonderful location and full of great friends. Thank you so much to the organizing committee for all of the hard work you do!

This year I will be presenting Locks, Blocks, and Snapshots: Maximizing Database Concurrency. This light-hearted and informative session will take a look at SQL Server’s available isolation levels, what they are, how they differ, and why the default setting of “read committed” may not be appropriate for your workload. It should not only be helpful, but lots of fun as well. I hope you will join me!

As of the time of posting, SQL Saturday Madison registration is still open. If you’d like to see some awesome presentations on Microsoft data technologies, register today!

Mar 12, 2018
 

I think one of the reasons I like computers is that they’re so much easier to deal with than people. They’re predictable, emotions never get in the way, and there’s always a logical explanation for everything if you dig deep enough. All that being said, I am grateful for my hobbies that don’t involve sitting in front of a computer all day. Thinking through the things I enjoy in my spare time, three central themes keep popping up: art, architecture, and railroads. Here are two examples:

Turner Family Christmas Cards

The University of Illinois will always be a special place to both me and my family. I graduated from there, was a proud member of the marching band, and got my start in public speaking there. It’s also the place where I met my wife, and to top things off we were married on campus. We have plenty of Illinois memorabilia of varying degrees of rarity, but the collection I’m most proud of is something so obscure that most Illini fans won’t even know these items exist.

Fred Turner spent nearly 50 years of his life at the University of Illinois, beginning as a student in 1918 and ending with his retirement in 1966 as the university’s first Dean of Students. Turner loved Illinois and its history, and in 1946 he and his wife, Betty, decided to highlight these topics in the Christmas cards they sent to friends and family. Turner had recently picked up the Japanese art of woodblock printing, and decided to share it by creating a hand-carved woodblock print of an Illinois historical site each year as the cover of their Christmas card. For an extra tie to campus, the blocks for printmaking were carved from wood salvaged from one of the university’s earliest buildings. They printed 260 to 275 cards by hand at their kitchen table each year until 1974.

The cards contain a main image on the front, a brief description and Christmas message from the Turners on the inside, and a fact sheet with more information on the historical building or site chosen. Here’s what they are like:

Turner Christmas Cards
I can only imagine that most of these cards have been lost to a combination of time and people just not knowing what they are. I know of only two complete sets of all 29 cards. As for my own collection, I’ve got a long way to go – I only have three. I found my first one five years ago, and have only come across two more since. Our trips to Central Illinois frequently involve trips to flea markets and antique shops, and now you know exactly what I look for.

If you’re curious to see what more of these cards look like, the university archives has a website showing cards from different years.

Railroad Photography

I’m guessing you’ve never heard of O. Winston Link. He was a civil engineer by degree and a photographer by trade, who in 1955 undertook a personal side project which would end up becoming his life’s work. The Norfolk and Western Railway was the last major railroad in the United States to still use steam locomotives at that point, and Link obtained full permission from their president to photograph the last days of the steam era on their railroad, primarily in Virginia and West Virginia coal country. He was even given a key to the railroad’s switch boxes, allowing him to phone dispatchers to obtain exact arrival and departure times for trains and, occasionally, the ability to request a train be held so a photo could be taken at a particular moment.

Hawksbill Creek Swimming Hole. Luray, VA. August 9, 1956.

Link’s photos were not ad-hoc snapshots in any way. The majority of them were taken at night (making him one of the first to popularize nighttime photography), and illuminated by enormous flash arrays he built himself. Link said that not only did nighttime photos romanticize the trains further, but steam locomotive smoke appears white, and therefore not dirty, when lit by flash. Utilizing his background in advertising and marketing photography, his photos were carefully composed to tell a story, and almost always incorporate people to add a humanizing element.

I wish I could tell you I collect his photos, but that’s not possible. I can only afford to be an enthusiast. Original prints typically sell for thousands of dollars at auction. I do own all of his books, which are absolutely beautiful. The tales he tells in black and white are richer than anything I could ever imagine in color.

Link lived in relative obscurity; his photos were popular with railfans but didn’t gain mainstream recognition until decades later. You won’t find many photos of him – he preferred to stay behind the camera – but he did have a cameo appearance in one of my all-time favorite movies, October Sky. If you’ve never seen that movie, add it to your list!

I like to say that if you watch October Sky, and you recognize the train engineer to be O. Winston Link, you’re really smart and observant. But if you’re a true rail nerd, you’ll also know that the locomotive he’s driving is a 2-8-2 “Mikado” #4501, which was actually owned by the Southern Railway and painted to be a Norfolk & Western engine just for the movie. The N&W never used 2-8-2 locomotives.

If you made it this far, thank you for reading about my hobbies. I promise to return to my regularly scheduled technobabble in short order.

This month we celebrate the 100th edition of T-SQL Tuesday, originally created by Adam Machanic back in November 2009. Adam asked us to write about what topics we might be covering 100 months into the future. I’m not good at predicting the future, but in the spirit of time travel, I am going one month into the past and writing about last month’s topic, which was hosted by Aaron Bertrand.