Today’s Monday Morning Mistake issue is another painfully common one that we all run into at some point. You’ve got some new code from a developer that’s supposed to be run against QA (because you’re a good DBA and you don’t grant developers access to anything outside of Development). Part of the code drops some existing tables and recreates them with new schemas, objects, etc. You hit the execute button, and the blood drains from your face when you realize the instance you were connected to was a production instance!
You have multiple instances of SQL Server to manage and you want a visual indicator in SQL Server Management Studio (SSMS) to let you quickly know which environment you’re connected to.
Pragmatic Works has now released version 3.5 of the award-winning BIxPress software! For those not familiar with BIxPress, it’s a tool that helps you develop SSIS/SSAS solutions faster, quickly and easily deploy SSIS packages, monitor the performance of SSIS packages, and much, much more!
So what’s new with 3.5? In addition to now having SQL Server 2012 support (SSIS), you may have noticed in previous releases that Reporting Services didn’t get much love. That all changes with this release, as we have now incorporated health monitoring of Reporting Services instances, called Reporting Performance Monitor! The new dashboard includes Average Runtime for Reports, Longest Running Reports, Report Memory Usage, Average Number of Rows, Active Users, et cetera:
Another great feature in this release is the Report Deployment Tool. This feature lets you quickly and easily deploy your Reports, Folder Structures, and Data Sources between Reporting Services instances!
One more major update in this release is the update to the SSIS Package Performance monitoring interface. Same great insight, new cleaner interface!
To quickly summarize the issue, there’s a 20-core limit in place with Enterprise edition (UPDATE – Thanks for this clarification point, Aaron: to be clear, the 20-core restriction *only* applies if you upgrade Server + CAL licenses via Software Assurance. With core licensing, the limit is the number of cores you license. In other words, if you buy a 64-core Enterprise license, you get to use all 64 cores.)! In a nutshell, that means if you have a server with 4 8-core processors for a total of 32 cores, and you install SQL Server 2012 on it previously licensed Server + CAL with SA, SQL Server will only “see”/use 20 of those cores! This is a huge deal and one I’m really surprised has not been addressed more vocally by the user community. I’ve already seen a couple of statements as strong as “based on this, we will seriously start looking at another platform”. My hope is that if enough noise is made by the customer base, Microsoft will at least raise that limit, similar to how VMware changed their licensing for vSphere 5 based on customer backlash. Have you or your company run into this issue yet? Let me hear your thoughts in the comments.
Yet another update: Per Steve Jones’ request in the comments, here’s a visual breakdown of the scenarios and how each could affect you:
Licensing in 2012 | How it’s licensed | Can I use all my cores?
Per Processor (4 procs, 4 cores) | License all 16 cores (buy 8 core packs*) | Yes
Per Processor (4 procs, 8 cores) | License all 32 cores (buy 16 core packs) | Yes
Grandfathered Server (4×4 cores) + CALs (with SA) | No core packs purchased due to agreement | Yes
Grandfathered Server (4×8 cores) + CALs (with SA) | No core packs purchased due to agreement | No (limit to 20)
Server (4×8 cores) + CALs (with SA) | License 32 cores (buy 16 core packs) | Yes

*Core packs come in pairs, so 16 cores requires 8 packs to be purchased, etc.
Existing Enterprise Edition licenses in the Server + CAL licensing model that are upgraded to SQL Server 2012 and beyond will be limited to server deployments with 20 cores or less. This 20 core limit only applies to SQL Server 2012 Enterprise Edition Server licenses in the Server + CAL model and will still require the appropriate number/versions of SQL Server CALs for access.
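If you want a quick sanity check of how many cores your instance can actually see and use, one way (on SQL Server 2005 and up) is to compare the CPU count the OS reports against the schedulers SQL Server has actually brought online:

```sql
-- Logical CPUs the OS reports to SQL Server
SELECT cpu_count FROM sys.dm_os_sys_info;

-- Schedulers SQL Server is actually using for user work;
-- if you're hitting the 20-core limit, this count will be
-- lower than cpu_count
SELECT COUNT(*) AS online_schedulers
FROM sys.dm_os_schedulers
WHERE status = 'VISIBLE ONLINE';
```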
Addendum: Please note, this blog (or any other) should NOT be your definitive source for licensing information. For that, always, ALWAYS contact your local Microsoft rep, as they have the details of your specific agreements and options. This post is meant for informational purposes only.
I just wrapped up my 24 Hours of PASS session on consolidation. A big THANK YOU again to everyone in attendance, who kindly put up with my horrendous rapping “skills”! As promised, below is the link to the slide deck. In the presenter’s notes you’ll find some good links and resources for consolidation.
Welcome back to another edition of the Monday Morning Mistakes series. Today’s issue is one I tend to run into quite often with clients, and it’s an important topic to know about as a database administrator. Without further ado, let’s get to our issue.
You have the SQL Server database engine installed on a system alongside other services such as Analysis Services, Reporting Services, and/or Integration Services, and you constantly seem to run out of memory. Restarting a service seems to fix the issue temporarily, but some time later the same problem returns.
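Out of the box, the database engine will grab as much memory as it can get, which starves the other services sharing the box. A common mitigation (just a sketch here; the 4096 MB value is only an example, so size it for your server’s total RAM and workload) is to cap the engine’s memory so the other services have room:

```sql
-- 'max server memory' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the database engine's memory; 4096 MB is an example value only
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;
```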
Expressions in SSIS are great. They allow you to create dynamic values for all sorts of stuff like variables, connection strings, properties for almost anything, etc. One huge issue that tends to trip up a lot of folks, especially those new to SSIS, is the evaluation of those expressions when using variables.
You create an SSIS variable with an expression, but at runtime the expression is not evaluated as expected. Instead, the variable keeps its default/static value.
Make sure the “EvaluateAsExpression” property on the variable is set to True. Without it set to True, the variable will evaluate to the hard-set value.
Inspired by common emails and questions I see, I figured I’d do a series of blog posts on common mistakes folks make with SQL Server called Monday Morning Mistakes (or #sqlM3 for short, since we all love quick hashtags these days). These are meant as quick fixes, nothing too comprehensive. Also since I just made up a hashtag, feel free to share your own #sqlM3 tips on Twitter anytime! Without further ado…
Today’s quick issue: Can connect to SQL Server locally but can’t connect from other server or computer.
Quick answer: Remote connections (read also: any connections that are not local) may be refused out of the box; in SQL Server 2005 and higher, the TCP/IP protocol is disabled by default in some editions (notably Express and Developer). You have to manually enable the TCP/IP protocol for the instance to allow remote connectivity. This requires a service restart to take effect.
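If you’re not sure which protocol your current session is using, one quick check (run locally, where you can connect) is:

```sql
-- Shows the protocol of your current connection:
-- 'Shared memory' for a local-only connection,
-- 'TCP' once TCP/IP connectivity is in use
SELECT net_transport
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```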
I’m in the middle of a database migration and thought I’d quickly share a script I threw together to show the estimated time of completion for a database restore in progress on SQL Server. The script will also show you the estimated completion time for database backups.
Please don’t take this script as gospel; the best way to truly know how long restores will take is to actually perform a restore! Remember, folks:
Backups are worthless, restores are priceless
SELECT session_id,
       command,
       percent_complete,
       estimated_completion_time / 60 / 1000 AS estimated_completion_minutes,
       DATEADD(MINUTE, (estimated_completion_time / 60 / 1000), GETDATE()) AS estimated_completion_time
FROM sys.dm_exec_requests
WHERE command = 'BACKUP DATABASE' OR command = 'RESTORE DATABASE';
NOTE: Because this script uses DMVs, it will only work on SQL Server 2005 and higher.
This week we’re going to talk about a topic that has been gaining steam in the last few years and, as it has, has started impacting database administrators’ worlds more and more: virtualization. Why do I make this statement? Well, since the economy currently sucks, shops are finding ways to consolidate and make their dollars stretch a little further. Back in the day, when you had a new application, you pretty much went out, bought yourself some new servers, and went on your merry way. Now, when money’s tight, folks are a little less likely to go out and simply buy new equipment for each individual application. Not only is this option expensive, there are other factors to think about, such as space (the data center may not have capacity for new servers), electricity, and cooling.
Enter virtualization. Virtualization lets you address this server sprawl by buying a single physical server, filling it with plenty of the typical resources such as CPU, memory, and drives, and creating virtual servers on that one piece of hardware that look/act/feel like independent servers. This week we’re going to cover some basics of virtualization and things you need to know if you’re going to go that route in your shop.