Sep 14 2017
 

by @WidowPage

When I started writing this series of posts, this was the one I couldn’t wait to do.  But in addition to my fabulous and glamorous DBA job, I also coach my daughter’s volleyball team, and that is about to kill me!  I felt like I couldn’t do this post the justice it deserved if I just whipped it out, so I waited until I had the time to do it right.

Of all the cure recipes I’ve written about, this one is the most fun because the beer really enhances the pork’s flavor.  If you were to taste the bacon, you wouldn’t be able to say “wow, this bacon was cured in beer,” but the difference is there.  I’m sure there’s some science behind how it works, but for now just believe me when I say that you should try this.

I got the recipe from here.  The ingredients are:

  • 5 lb of pork belly
  • 750ml beer
  • 1 quart of water
  • 1/2 cup kosher salt
  • 1/2 cup brown sugar
  • 1 tablespoon garlic powder
  • 1 tablespoon onion powder
  • 1 tablespoon black pepper
  • 2 bay leaves
  • 1 teaspoon pink salt (cure #1)

Let’s talk about the beer.  In the original recipe, the authors used a tripel.  In case you aren’t up on your brewing practices, tripels got their name because the recipe uses triple the normal amount of malt.  Modern brewers also use candi sugar, and the resulting alcohol levels can be quite high.

But I’ve also never felt restricted just to tripels.  My local liquor store offers me this selection.

 

 

This selection prompted me to experiment.  I’ve tried stouts, porters, lagers, even a smoked beer.  I can’t tell too big a difference in the flavor, but the pork will take on the color of the beer.  For this round of bacon curing, I went rogue and tried a hard cider.  I picked Angry Orchard’s Walden Hollow, which is made from Rome, Jonathan, McIntosh, Newtown Pippin, Golden Russet, and Rhode Island Greening apples.

In the first step of the recipe, you combine the sugar and salt with the water and bring it to a boil.

Once that water/sugar/salt mixture is cool, add the rest of the ingredients.  Mix well and add the pork belly.  Let it sit refrigerated for a week to 10 days.

Stay tuned:  Only 2 more posts to go!

 

Sep 05 2017
 
Most bacon recipes are either sweet or savory.  Yesterday, we did a cure with maple syrup, so today, we’ll go with a savory taste.  I got this recipe from here.  Assemble the ingredients into a large plastic bag and shake to mix.
  • 1/4 cup salt
  • 1/2 cup brown sugar
  • 2 tablespoons black pepper
  • 1/2 teaspoon ground bay
  • 1 teaspoon granulated onion
  • 1 teaspoon granulated garlic
  • 1/2 teaspoon ground thyme
  • 1 teaspoon pink salt

 

This recipe is for 3 to 4 lbs of pork belly.  Mine is right around 8 lbs so I just doubled it.  And yes, that is a lot of black pepper.
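Doubling worked here because 8 lbs is exactly twice the recipe’s upper weight, but every cure in this series scales linearly with belly weight, so the same arithmetic works for any size.  A tiny helper (Python, with illustrative quantities) shows it:

```python
def scale(quantity, base_lbs, target_lbs):
    """Scale one ingredient quantity from the recipe's base belly weight
    to the weight of the belly you actually have."""
    return quantity * target_lbs / base_lbs

# The recipe above is written for roughly 4 lbs of belly;
# for an 8 lb belly, every quantity simply doubles.
print(scale(1.0, 4.0, 8.0))   # teaspoons of pink salt for 8 lbs -> 2.0
print(scale(0.25, 4.0, 8.0))  # cups of salt for 8 lbs -> 0.5
```

The pink salt is the one ingredient where this math actually matters; the spices you can eyeball, but the cure should stay proportional to the meat.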
 
Here’s the before shot.
Here’s the after shot.  
Put the pork belly into a plastic bag and dump the extra spice mixture into the bag with the belly.  Seal it up. Wrap it with painter’s tape, label it and stick it into the fridge for 7-10 days.  It’s that easy.  
 
After today, we’ll get into some unusual cures.  

Once we do those cures, I’ll tell you about a field trip to a bacon counter here in Chicago and then we’ll smoke all the pork bellies and see the final product.

Aug 31 2017
 

By @WidowPage

This isn’t the way I intended to start this blog. I imagined my first post would be pictures unveiling huge boxes of soon-to-be bacon and describing the mouth watering ways I would convert it into bacon, but my pork belly order has yet to arrive. It was supposed to leave Iowa on Monday, August 21 and get here in Illinois the following day. It’s now Wednesday, August 30 and the magical box of pork belly has yet to appear. I’m hoping for the best and a call to the farmer did reassure me.

In the meantime, this delay gives us the chance to cover some background info. I’ve been a DBA for 15 years and met Bob while we were working together at Morningstar. But you are here for the bacon, so let’s focus on that. Roughly 9 years ago, my dad cleaned out an old oil drum, welded it into a smoker, and presented it to my husband as a Christmas present. We joked that it was the largest smoker in our Chicago suburb. It’s huge, and we could smoke 4 turkeys at once on it. My husband used it a lot and I assisted.

My Smoker

When my husband died, I ignored the smoker for a good year. I had plenty of other things to manage and it wasn’t on my radar. I finally resurfaced and started using it again for the customary brisket, hams and ribs.

About 4 years ago, I had an epiphany. I wanted to smoke something different and I decided to try bacon. I was surprised how easy it was and how much better it tasted than store bought. You just put the pork belly in the cure, let it sit for a week, and then cook it to about 140 degrees. That’s it. So simple and so much better than the stuff you get from Oscar Mayer.

At some point in my bacon adventures, I started researching heritage hog breeds, and that led me to the following article in the New York Times about the work Carl Blake was doing establishing a heritage hog farm in Iowa.  The description of the pork was mouth-watering.

http://www.nytimes.com/2013/03/01/us/with-iowa-swabian-hall-a-farmers-quest-for-perfect-pig.html?mcubz=1

I contacted Blake begging for a pork belly. That was about three years ago. Last week he posted on his Facebook page that he had boxes of Mangalitsa bellies for sale. If you’ve not heard of Mangalitsa hogs, they are a Hungarian breed, rare in this country as the hogs don’t grow well in industrialized farm settings. Google “fuzzy pigs” and you’ll also learn that they are the Kobe beef of the hog world. Their high fat content means much more flavor than regular hogs and makes for excellent bacon.

Mangalitsa hogs are also known as fuzzy pigs

As soon as I saw Blake’s post, I called him asking to buy 2 boxes of pork bellies. That’s 100 lbs and is probably excessive, but I’ve waited three years for the chance to do this.

Knowing how much bacon is revered in DBA circles, I asked Bob if I could use his blog to share this bacon adventure and he obliged.  My plan is to walk you through the entire curing and smoking process complete with photos of every step and the final product.  I hope you will stick around for the whole bacon process.

Postscript: I came home yesterday to two huge boxes on my porch. My next post will show you exactly what 100 lbs of pork belly looks like.

Apr 06 2017
 

At last month’s SQL Saturday in Chicago, we had two great distinctions:

  • We were SQL Saturday #600, a milestone! (But weren’t SQL Saturday numbers going away?)
  • We also were the first SQL Saturday to use the new logo, which is just one part of the major rebranding project undertaken by PASS last year.

At PASS Summit 2016, PASS announced several new logos as part of its rebranding campaign. There were new logos for PASS as a whole, SQL Saturday, 24 Hours of PASS, PASS Summit, and Business Analytics. In general they’re not terrible. I actually really like the new PASS logo. The old one just never made sense to me. Was it a St. Andrew’s Cross spider on top of a storm warning flag? (Since Sharknado was a hit, a movie about a hurricane full of spiders is a guaranteed blockbuster, right?) If there was any symbolism behind that old PASS logo, I’ve never heard it.

Old (left) and new (right) PASS logos

The new PASS logo, on the other hand, has a nifty story behind it. It’s all the different facets that make up PASS coming together into one. I like that.

But then there’s the new SQL Saturday logo.

Old (top) and new (bottom) SQL Saturday Chicago logos

SQL Saturday’s new logo is much more refined than its predecessor, and it’s very evident that a lot of effort went into these new logos and this rebranding as a whole. That being said, I think the new logo falls short in a few key areas.

First of all, there’s the symbol itself. It definitely works well with the PASS logo, but look at it. If you think about it in SQL terms (and since it’s a logo for SQL Saturday, that’s not too hard to imagine) you’ll see that it’s literally “<>”, the T-SQL operator for “not equals”. I have to imagine that whoever designed this logo was a graphic designer with absolutely no clue what SQL is, or what the symbol they designed would mean to people familiar with it. And I think that’s okay to a degree, but very early on in this process someone at PASS probably should have looked at it and said “ya know, that looks like a not equals operator. Is that really how we want to symbolize SQL Saturday?”

Second, the typeface used in the new logo is much more modern, and the letters are significantly thinner than the old one. It looks great on a computer screen or when printed on paper, but think about what SQL Saturday logos are used for. In a lot of cases they are embroidered on things like shirts or jackets. Did whoever designed this logo know that? Once again I’ll assume they didn’t, otherwise they probably would have accounted for that.

The smaller a detail is, the more difficult and expensive it becomes to embroider, and this logo is full of small details. Just look at the PASS logo located inside the SQL Saturday logo. It’s microscopic. We had to use a bolder typeface for the speaker jackets we gave out in Chicago this year; it just didn’t look good otherwise. We also had to make it single-color and remove the gradients. But by making all those changes, we technically changed the logo, which has become a big no-no in recent years. The SQL Saturday Wiki states: “Per the SQL Saturday license, the event logos provided to you by PASS are not to be altered in any way.” So if changes are necessary to be able to embroider this logo, but it can’t be altered in any way, does that mean organizers will have to run afoul of the license agreement? Or just do away with speaker shirts altogether? I don’t know, and I’m not sure there’s any way to tell.

SQL Saturday Chicago embroidered on jacket

It really doesn’t have to be this difficult though. This logo is still quite new, relatively unused, and there seems to be quite a few members of the community (SQL Saturday organizers in particular) who think this logo could use some work. Why not change it now? I’m sure there’s a way to come up with something that fits in with PASS’ new branding, is easier to embroider, and has a better message than “not equals.”

Aug 01 2016
 

I’m extremely fortunate to have been selected to speak at PASS Summit, “the world’s largest gathering of SQL Server and BI professionals.” PASS has once again put together a fantastic lineup, and I’m extremely proud to have made the cut. As many have already done, I’d like to share the abstracts I submitted along with the feedback I received.

I submitted a total of 5 general sessions (the maximum allowed), with one being accepted. I will cover each of them here, along with the notes I received from the reviewers.

Supercharging Backups and Restores For Fun and Profit (Accepted)

Level: 300
Track: Enterprise Database Administration & Deployment
Topic: Backup / Restore, Disaster Recovery

Abstract:
Super-fast queries are an essential part of any business process, but speed will never be more important than during a disaster when you need to restore from backup. Come and see how both backups and restores can be tuned just like a query. In this demo-intensive session, we will discuss the different phases of the backup and restore processes, how to tell how long each of them is taking, and which are the easiest to significantly speed up. You just might be surprised how simple it is to achieve dramatic results – cutting your backup and restore times by 75% or more is absolutely possible using the methods covered here.

Prerequisites:
Attendees should have a solid understanding of SQL Server backup and restore operations.

Goals:

  • Learn tips and tricks for speeding up backup and restore processes and methods for tuning them that can have dramatic results.
  • Understand what happens during backups and restores, and which phases of their execution can have the most time shaved off of them.
  • Learn trace flags that expose extra information about the backup process and how to leverage this knowledge for maximum benefit.

Feedback I Received:

  • Abstract: well written, engaging
    Topic: draws attention
    Subjective: personally never been in a position where an emergency restore could be considered fun, but interested to here when it was
  • I would like to attend this session. The title is eye catching, The experience level is good but those DBA with less experience could attend and learn easily based on the information listed.
  • The outline seems well developed. The goals appear to be interesting for attendees. There appears to be a reasonable amount of live demonstrations in relation to the topic being presented.
  • high level, 75% of demo and minimum slides. And important topic. Very interesting session

My Comments:
I’m very happy this topic was accepted; I’ve presented it at several SQL Saturdays and have been wanting to do it at PASS Summit for several years now. Tuning queries is seen as a common task, and I’ve always thought that tuning backups and restores is a logical progression from that. In response to the first reviewer’s question, I think an emergency restore can be fun when you’re prepared for it. When you’ve practiced your disaster scenarios, have all your scripts ready, and know how long the restore will take, there’s not a whole lot left to be stressed about. As for the “high level” of demos, this is a demo-heavy session. You can only talk about backups for so long before it becomes worth it to actually start doing them. Not to mention it’s very helpful to show the audience how dramatic the results can be with some demos.
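To give a taste of the kinds of knobs involved (not necessarily the exact content of the session), here’s a sketch in T-SQL; the paths and database name are made up, but the options and trace flags are standard:

```sql
-- Trace flags 3004 and 3605 write per-phase backup/restore timings
-- to the SQL Server error log, so you can see where time is going.
DBCC TRACEON (3004, 3605, -1);

-- Striping across multiple files and tuning buffer settings can cut
-- backup time substantially; the right values depend on your I/O
-- subsystem, so always test against your own hardware.
BACKUP DATABASE [SalesDB]
TO DISK = N'E:\Backup\SalesDB_1.bak',
   DISK = N'F:\Backup\SalesDB_2.bak'
WITH COMPRESSION,
     BUFFERCOUNT = 64,
     MAXTRANSFERSIZE = 4194304;  -- 4 MB transfers

DBCC TRACEOFF (3004, 3605, -1);
```

The same BUFFERCOUNT and MAXTRANSFERSIZE options apply to RESTORE as well, which is where the tuning pays off most in a disaster.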

 

Good Migrations: Moving Maximum Data with Minimum Impact (Not Accepted)

Level: 300
Track: Enterprise Database Administration & Deployment
Topic: Database Maintenance

Abstract:
A database at rest tends to stay at rest, until it needs to move. This session will cover various methods available to migrate a SQL Server database from one location to another. Whether moving to a new storage system, a new server, or even to the cloud, there are a multitude of options available, many of which involve little to no user impact. Lack of SQL Server Enterprise Edition isn’t always a problem – many of these methods work for Standard Edition servers as well. We will discuss how to determine the most appropriate migration option based on your environment’s constraints, the pros and cons of each method, and planning and testing your migration. Come see how moving a multi-terabyte database with only a few minutes of downtime is completely possible.

Prerequisites:
A good understanding of SQL Server files, filegroups, and index rebuild processes would be helpful.

Goals:

  • Be able to determine which migration method is most appropriate for given uptime requirements and organizational/environmental constraints.
  • Learn how to plan and test a database migration to maximize chances of success long before any queries are run.
  • Understand the many different techniques for moving databases, filegroups, and objects between different servers and/or storage, and the advantages and disadvantages of each.

Feedback I Received:

  • Abstract: Clearly written abstract with well aligned goals.
    Topic:Interesting topic that will attract DBA’s on the operations side of the fence.
    Subjective: I’d attend this session, as it sounds like a great topic.
  • Abstract – Outline is well developed. Level seems a bit high. Goals are well developed
    Topic – Title is good but would like to see if this is for which version of SQL 2012/2014/2016?
    Subjective – Would like to see presentation not only with moving data but imports as well aside from SQL Partitioning. Would like to see more demos but didn’t downgrade for that.
  • Abstract: detailed, compelling
    Topic: relevant, useful
    Subjective rating: interesting
  • Demo % seems to be low for 300 level session

My Comments:
Having worked on a system for many years that has grown more quickly than its storage budget, I’ve had to do a lot of creative things to move data around on-the-fly. This session covers a bunch of those tricks, which as you can imagine, end up being a little more interesting than a simple online index rebuild. I don’t include partitioning because that’s an entirely different topic and could easily take up an entire presentation on its own. As for the low amount of demos (25%), a lot of these operations are rather time-consuming and really wouldn’t fit well into a 75-minute session. I’d love to present this topic at the summit someday; I think attendees would get a lot out of it. Also I’ve yet to see something similar to this on the schedule, so it could definitely be something different.
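As one illustration of the kind of online data movement described here (table and filegroup names are hypothetical), rebuilding a clustered index onto a new filegroup relocates all of the table’s data while it stays available:

```sql
-- Rebuild the clustered index onto a different filegroup.
-- DROP_EXISTING moves the data in a single operation; ONLINE = ON
-- (Enterprise Edition) keeps the table readable and writable throughout.
CREATE CLUSTERED INDEX [CIX_Orders]
ON dbo.Orders (OrderID)
WITH (DROP_EXISTING = ON, ONLINE = ON)
ON [FG_NewStorage];
```

On Standard Edition the same statement works without ONLINE = ON, at the cost of blocking during the rebuild, which is exactly the sort of trade-off the session weighs.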

 

Manage & Visualize Your Application Logs with Logstash & Kibana (Not Accepted)

Level: 200
Track: Enterprise Database Administration & Deployment
Topic: Management Tools

Abstract:
The logs kept by Windows, SQL Server, and other applications contain a treasure trove of information about the health and activities of a system. However, as an environment grows in size and complexity, the number of logs quickly starts to become unmanageable. Fortunately there is a group of free open-source tools: Elasticsearch, Logstash, and Kibana, known collectively as the “ELK” stack.

This session will demonstrate how to use Logstash to manage all application and error logs in your environment, regardless of format or operating system. You will learn how to configure Logstash to capture logs from SQL Server or any other system, organize and archive them in real-time with Elasticsearch, and create helpful web-based dashboards in Kibana. Don’t miss this opportunity to unlock the hidden power of all your application logs with the ELK stack!

Prerequisites:
Attendees would benefit from a general understanding of the SQL Server error log and how it behaves.

Goals:

  • Learn about the components of the ELK stack, what they do, and how they interact with each other.
  • Understand how Logstash works and how to configure it to collect log information from any file format or logging method, using SQL Server error log files as an example.
  • See how to build dashboards in Kibana to quickly visualize errors and warnings across your environment.

Feedback I Received:

  • Abstract: Abstract is clear and well written.
    Topic: Topic is interesting and useful. Not sure if there would be enough demand for this topic.
    Subjective: I would like to attend this session. Seems like a good way to leverage other stacks for ease of admin.
  • Abstract: The outline and details of this abstract are well written!
    Topic: This is very interesting topic
    Subjective: I will attend this session
  • Well developed. I would like to attend this session.

My Comments:

This is absolutely a niche topic so I can understand why it wouldn’t get accepted. Sure sounds like the reviewers thought it was intriguing though. I run my ELK stack in Linux and use it to ingest system and application logs from a wide variety of machines. While this session would be more tailored to monitoring your SQL Server logs, it would also address monitoring virtually any log on any platform. This isn’t really database-centric, and certainly isn’t exclusive to SQL Server. While I think it would be very useful, I absolutely understand why this one didn’t make the cut.
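For anyone curious what the Logstash piece looks like, here’s a minimal pipeline sketch; the file path, index name, and grok pattern are illustrative and would need tuning for your environment and log format:

```
# Minimal Logstash pipeline sketch: tail the SQL Server error log,
# parse each line, and ship it to Elasticsearch.
input {
  file {
    path => "C:/Program Files/Microsoft SQL Server/MSSQL13.MSSQLSERVER/MSSQL/Log/ERRORLOG"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Error log lines start with a timestamp, then a source (e.g. spid53),
    # then the message text.
    match => { "message" => "%{TIMESTAMP_ISO8601:logged_at}\s+%{WORD:source}\s+%{GREEDYDATA:text}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sqlserver-errorlog"
  }
}
```

Once the data is in Elasticsearch, Kibana dashboards over the `source` and `text` fields make errors across an environment easy to spot.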

 

Automating Your DBA Checklist with Policy-Based Management (Not Accepted)

Level: 200
Track: Enterprise Database Administration & Deployment
Topic: Policy Based Management

Abstract:
Manually reviewing database compliance checklists is an excellent way to ensure that processes are followed consistently, but it is also extremely time-consuming. Let’s automate the process! SQL Server’s Policy-Based Management is a powerful and simple-to-configure feature that can ensure that all of your best practices and data policies are consistently enforced throughout your environment.

Come see how easy it is to make sure all your SQL Servers comply with Microsoft’s recommendations or any other constraints your deployment requires. This session is loaded with demos to show you how to write policies, evaluate them across groups of instances, and even set up automated reporting so you can have a list of non-compliant servers delivered to you. Years after its introduction, Policy-Based Management is still one of SQL Server’s best-kept secrets. Attend this session and learn how to work smarter, not harder, by leveraging Policy-Based Management to simplify your day-to-day tasks!

Prerequisites:
Attendees should have a basic understanding of SQL Server administration, maintenance processes, and why they are necessary.

Goals:

  • Understand the capabilities of Policy-Based Management and how it can be used to uniformly enforce settings and other aspects of SQL Server.
  • Learn how to author policies, evaluate them both manually and automatically across multiple servers, and configure automated reporting of them using the Enterprise Policy Management Framework.
  • Leave with a checklist of best practices to automate on your servers, as well as knowledge of Microsoft’s included scripts that can help get you started.

Feedback I Received:

  • The outline seems to clearly describe the contents of the presentation. The title appears to reflect the content described in the abstract. The topic and goals should be compelling to attendees. The topic and goals appear to deliver an appropriate amount of material for the time allotted.
  • Abstract: clearly stated, interesting
    Topic: good title
    Subjective: interesting subject, and something I use often
  • good content. It would draw people to attend this session.
  • Very interesting topic, From one perspective is a basic of basics but from another we still need teach how to use PBM.

My Comments:
Policy-Based Management is incredibly useful in that it allows you to easily author “sanity checks” to make sure your databases are in compliance with whatever standards the business decides are necessary. However PBM isn’t really sexy and it’s certainly not that new – it’s had very few changes since it was released along with SQL Server 2008. As one reviewer said “it’s a basic of basics”. It is, but so many systems I see still don’t use it, typically because the DBA isn’t aware of it. From what I can tell, no sessions covering PBM were chosen this year. That’s a shame, because it could help a lot of people. But in an industry where new things always get the most attention, and at a conference with a finite number of presentation slots, it’s understandable why you won’t see any sessions on it.
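For a quick flavor of working with PBM programmatically (this is just a sketch against the real msdb history tables, not the EPM Framework itself), you can list recent failed policy evaluations like so:

```sql
-- Recent policy evaluations that failed, newest first.
-- result = 0 indicates the policy evaluation did not pass.
SELECT p.name       AS policy_name,
       h.start_date,
       h.result
FROM msdb.dbo.syspolicy_policy_execution_history AS h
JOIN msdb.dbo.syspolicy_policies AS p
  ON p.policy_id = h.policy_id
WHERE h.result = 0
ORDER BY h.start_date DESC;
```

A query like this is the raw material for the automated non-compliance reporting the abstract mentions.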

 

SHA, Right! SQL Server Encryption Basics (Not Accepted)

Level: 200
Track: Enterprise Database Administration & Deployment
Topic: Security: Access / Encryption / Auditing / Compliance

Abstract:
High-profile attacks by hackers have made the news more and more the past few years, and your database is a prized target! Fortunately SQL Server offers many possible layers of protection, one of which is encryption. This session will cover SQL Server’s various encryption capabilities, how they work, and their advantages and limitations.

You will learn what certificates are and why they matter, which encryption algorithms are available and which should be used, and how Transparent Database Encryption works and when to enable it. More recent features such as backup encryption and SQL Server 2016 Always Encrypted will also be explained. Restoring servers and recovering data can be thought of as difficult, but they are nothing compared to rebuilding your customers’ trust and repairing your reputation. Attend this session and learn how SQL Server can help you protect your data from prying eyes both inside and outside of your organization.

Prerequisites:
Attendees should have basic knowledge of SQL Server and a desire to learn about encryption.

Goals:

  • Learn about all the different ways SQL Server can protect your data through encryption.
  • Understand the strengths and weaknesses of each encryption technology, and the scenarios where each would be an appropriate solution.
  • Learn tips for designing databases where security through encryption is a prerequisite, not an afterthought.

Feedback I Received:

  • Encryption. Important and lovely topics. Worth to see it!
  • Abstract: detailed
    Topic: relevant, sql server 2016 is covered
    Subjective rating: interesting
  • OK, I’m in the dark — what is SHA?
  • Abstract – Good detail in abstract. Great opener and strong conclusion.
    Topic – Good goals. Attendees will be interested and seems compelling for attendees even if they don’t know in-depth security or encryption.
    Subjective – This is a great abstract. Session Prerequisites and Level match and since its previously presented the topic should be able to fit within the time frame allowed.
  • Abstract: it’s punny! good topic
    Topic: well written and informative of what will be covered and why
    Subjective: definitely interested in this session
  • Abstract: Great abstract supported by clearly defined goals. Abstract goes into an appropriate level of detail on deliverables.
    Topic:Great topic. Encryption is an ongoing concern and likely to be a solid draw.
    Subjective: I would attend this session Sounds like a great introductory conversation.

My Comments:
All the other sessions I submitted had 3 or 4 pieces of feedback (I’m assuming from 3 or 4 people). This one has 6! Encryption is a hot topic as of late, I wonder if that has something to do with the reviewer interest in this session. This is a rather basic presentation, and while it’s done rather well at several SQL Saturdays, I’m not sure it would be as popular at the summit anyway. Not being chosen kind of solidified my thoughts. Having a few sessions with deeper dives on a more narrow scope would probably be more popular, though I doubt any of those sessions would cover the basics in the depth that I do here.
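For reference, the TDE portion of a session like this comes down to a short, standard T-SQL sequence; the names, paths, and passwords below are placeholders:

```sql
-- Standard sequence for enabling Transparent Data Encryption.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!Passphrase';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

-- Back up the certificate immediately; without it, the encrypted
-- database (and its backups) cannot be restored anywhere else.
BACKUP CERTIFICATE TDECert
TO FILE = 'C:\keys\TDECert.cer'
WITH PRIVATE KEY (FILE = 'C:\keys\TDECert.pvk',
                  ENCRYPTION BY PASSWORD = 'An0ther!Passphrase');

USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDECert;

ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

The mechanics are simple; the hard parts, as the abstract suggests, are key management and understanding what TDE does and does not protect against.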

 

Thanks so much to the members of the Program Committee who volunteered their time to review abstracts. I know they do not have an easy time reviewing or selecting sessions for the schedule. (As a member of the Program Committee for several years now, I can speak from experience.) I value all feedback, and look forward to incorporating it into any future submissions.